Video Processor Android Setup - React Native
VideoSDK lets you apply custom effects to your video stream before it is transmitted during a video call by writing your own native video-processing code. This guide offers step-by-step instructions for creating and integrating a custom video processor for frame manipulation on Android.
Installation
- To create a custom video processor, you first need to install the react-native-webrtc package in your React Native app using either npm or yarn.
- NPM
- YARN
npm i "@videosdk.live/react-native-webrtc"
yarn add "@videosdk.live/react-native-webrtc"
- To enable native processor development on Android, add the following dependency to your build.gradle file located at <project root>/android/build.gradle:
build.gradle
dependencies {
implementation project(':rnwebrtc')
// other app dependencies
}
Step-by-Step Guide: Integrating Native Video Processor in React Native
Step 1: Create Your Own Processor Code
- First, create a class that implements VideoFrameProcessorFactoryInterface. Define the build() function, which returns a VideoFrameProcessor for processing each video frame. VideoFrameProcessor is an interface that allows you to apply transformations or effects to each frame before it is rendered or processed further. Add your custom logic here to manipulate the video frames as needed, and once the effect has been applied, return the processed frame. A concrete example is sketched after the Kotlin and Java snippets below.
- Kotlin
- Java
YourOwnBackgroundProcessor.kt
package com.yourOwnPackage
import live.videosdk.rnwebrtc.videoEffects.VideoFrameProcessor
import live.videosdk.rnwebrtc.videoEffects.VideoFrameProcessorFactoryInterface
import org.webrtc.SurfaceTextureHelper
import org.webrtc.TextureBufferImpl
import org.webrtc.VideoFrame
class YourOwnBackgroundProcessor : VideoFrameProcessorFactoryInterface {
    // Override the build function to access VideoFrames
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessor { frame: VideoFrame?, textureHelper: SurfaceTextureHelper? ->
            // Add your custom code here to apply the effect and return the processed VideoFrame
            val processedFrame = frame // placeholder: passes the frame through unchanged
            return@VideoFrameProcessor processedFrame
        }
    }
}
YourOwnBackgroundProcessor.java
package com.yourOwnPackage;
import live.videosdk.rnwebrtc.videoEffects.VideoFrameProcessor;
import live.videosdk.rnwebrtc.videoEffects.VideoFrameProcessorFactoryInterface;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.TextureBufferImpl;
import org.webrtc.VideoFrame;
public class YourOwnBackgroundProcessor implements VideoFrameProcessorFactoryInterface {
    @Override
    public VideoFrameProcessor build() {
        return new VideoFrameProcessor() {
            @Override
            public VideoFrame process(VideoFrame frame, SurfaceTextureHelper textureHelper) {
                // Add your custom code here to apply the effect and return the processed VideoFrame
                VideoFrame processedFrame = frame; // placeholder: passes the frame through unchanged
                return processedFrame;
            }
        };
    }
}
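As an illustration of what a processor can do, the Kotlin sketch below center-crops every frame to a square using the WebRTC buffer's cropAndScale() helper and returns a new frame. This is a minimal sketch rather than a production implementation: the class name SquareCropProcessor is a placeholder, and a real processor must also consider buffer retention/release and per-frame performance.
SquareCropProcessor.kt
package com.yourOwnPackage

import live.videosdk.rnwebrtc.videoEffects.VideoFrameProcessor
import live.videosdk.rnwebrtc.videoEffects.VideoFrameProcessorFactoryInterface
import org.webrtc.SurfaceTextureHelper
import org.webrtc.VideoFrame

// Example processor: center-crops every frame to a square.
class SquareCropProcessor : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessor { frame: VideoFrame, _: SurfaceTextureHelper? ->
            val buffer = frame.buffer
            val side = minOf(buffer.width, buffer.height)
            // Crop the centre square region, keeping its resolution unchanged.
            val cropped = buffer.cropAndScale(
                (buffer.width - side) / 2,  // cropX
                (buffer.height - side) / 2, // cropY
                side, side,                 // crop width / height
                side, side                  // scale width / height
            )
            // Wrap the cropped buffer in a new frame, preserving rotation and timestamp.
            VideoFrame(cropped, frame.rotation, frame.timestampNs)
        }
    }
}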
Step 2: Register your Processor
- Next, create a class VideoEffectModule that extends ReactContextBaseJavaModule. This class will be used in your React Native module to interact with the native Android code.
- In this class, register your custom video processor using the addProcessor() method from the ProcessorProvider class. Provide a String representing the unique processor name along with an instance of your processor to complete the registration. This processor name will be used later to apply the effects offered by your processor.
- Kotlin
- Java
VideoEffectModule.kt
package com.yourOwnPackage

import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.bridge.ReactContextBaseJavaModule
import com.facebook.react.bridge.ReactMethod
// ProcessorProvider lives alongside the videoEffects classes; adjust the import if your package version differs
import live.videosdk.rnwebrtc.videoEffects.ProcessorProvider

class VideoEffectModule(reactContext: ReactApplicationContext) :
    ReactContextBaseJavaModule(reactContext) {

    override fun getName(): String = NAME

    @ReactMethod
    fun registerProcessor(processorName: String) {
        // Create an instance of your Processor
        val bgProcessor = YourOwnBackgroundProcessor()
        // Register your processor with a unique name and its instance
        ProcessorProvider.addProcessor(processorName, bgProcessor)
    }

    companion object {
        const val NAME = "VideoEffect"
    }
}
VideoEffectModule.java
package com.yourOwnPackage;

import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReactContextBaseJavaModule;
import com.facebook.react.bridge.ReactMethod;
// ProcessorProvider lives alongside the videoEffects classes; adjust the import if your package version differs
import live.videosdk.rnwebrtc.videoEffects.ProcessorProvider;

public class VideoEffectModule extends ReactContextBaseJavaModule {
    public static final String NAME = "VideoEffect";

    public VideoEffectModule(ReactApplicationContext reactContext) {
        super(reactContext);
    }

    @ReactMethod
    public void registerProcessor(String processorName) {
        // Create an instance of your Processor
        YourOwnBackgroundProcessor bgProcessor = new YourOwnBackgroundProcessor();
        // Register your processor with a unique name and its instance
        ProcessorProvider.addProcessor(processorName, bgProcessor);
    }

    @Override
    public String getName() {
        return NAME;
    }
}
- Now, in your React Native app, you can register the processor using the module.
app.js
import { NativeModules } from "react-native";

const { VideoEffectModule } = NativeModules;

function register() {
  VideoEffectModule.registerProcessor("ProcessorName");
}
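The snippet above assumes that VideoEffectModule is exposed to JavaScript through a ReactPackage registered in your MainApplication. If your project does not already do this, a minimal package might look like the following sketch (the class name VideoEffectPackage is a placeholder); add an instance of it to the list returned by getPackages() and rebuild the Android app.
VideoEffectPackage.kt
package com.yourOwnPackage

import com.facebook.react.ReactPackage
import com.facebook.react.bridge.NativeModule
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.uimanager.ViewManager

// Exposes VideoEffectModule to JavaScript.
class VideoEffectPackage : ReactPackage {
    override fun createNativeModules(reactContext: ReactApplicationContext): List<NativeModule> =
        listOf(VideoEffectModule(reactContext))

    override fun createViewManagers(reactContext: ReactApplicationContext): List<ViewManager<*, *>> =
        emptyList()
}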
Step 3: Apply the Processor
- Once you have registered the processor, you can use it throughout the entire app lifecycle. To apply the effect provided by the processor, use the applyVideoProcessor() method from the VideoProcessor class. This method requires the name of the processor that was used during registration.
app.js
import { VideoProcessor } from "@videosdk.live/react-native-webrtc";

function applyProcessor() {
  VideoProcessor.applyVideoProcessor("ProcessorName");
}
Step 4: Remove the Processor
- You can remove the processor when you no longer need the effect. To do this, use the removeVideoProcessor() method from the VideoProcessor class, which removes the effect of the currently active video processor.
app.js
import { VideoProcessor } from "@videosdk.live/react-native-webrtc";

function removeProcessor() {
  VideoProcessor.removeVideoProcessor();
}