diff --git a/.claude/skills/upstream-sync.md b/.claude/skills/upstream-sync.md new file mode 100644 index 000000000..6cf6c08e4 --- /dev/null +++ b/.claude/skills/upstream-sync.md @@ -0,0 +1,141 @@ +# Upstream Sync Skill + +Sync fork with one or more upstream remotes via cherry-pick + merge-base advance. + +## When to use + +User says: "sync with upstream", "cherry-pick from X", "merge upstream", or similar. + +## Workflow + +### Phase 1: Explore divergence + +```bash +git fetch +git log --oneline --right-only .../master --no-merges +``` + +For each upstream-only commit, get files changed: + +```bash +git log --right-only .../master --no-merges --format="%h %s" | while read hash msg; do + echo "=== $hash $msg ==="; git diff-tree --no-commit-id --name-only -r $hash; echo +done +``` + +### Phase 2: Triage commits + +| Category | Action | +|----------|--------| +| Already in fork | SKIP | +| Release/version bumps | SKIP | +| Lock file only | SKIP | +| Native WebRTC lib version changes | SKIP (fork uses StreamWebRTC) | +| Merge commits | SKIP | +| Cosmetic formatting | SKIP (run formatters separately) | +| Bug fixes | CHERRY-PICK | +| New features | CHERRY-PICK (ask user) | +| Refactoring | CHERRY-PICK (evaluate risk) | +| Docs/CI/tools | Ask user | + +Check for equivalents: `git log --oneline --left-only .../master | grep -i ""` + +### Phase 3: Ask user + +Present triage. Ask about large/risky features, optional items, anything ambiguous. + +### Phase 4: Cherry-pick in order + +```bash +git checkout -b sync/upstream-cherry-picks +``` + +Order: TS fixes → Android fixes → iOS fixes → small features → large features → docs. + +If conflict: resolve, `git add`, `git cherry-pick --continue --no-edit`. +If empty after resolution: `git cherry-pick --skip`. + +### Phase 5: Merge to advance merge-base + +Without this, future merges replay ALL upstream commits including skipped ones. 
+ +```bash +git merge /master --no-commit + +# Conflicted files — keep ours +git diff --name-only --diff-filter=U | xargs git checkout --ours +# Files deleted in our branch — remove +git rm +# Auto-merged files — reset to our version +git diff --cached --name-only --diff-filter=M | xargs git checkout HEAD -- +# Unwanted new files from upstream — remove +git diff --cached --name-only --diff-filter=A # review, then: +git rm -f + +git add -A +git diff --cached --stat HEAD # should be empty or near-empty +git commit -m "merge: sync merge-base with /master" +``` + +Verify: `git log --oneline --right-only .../master | wc -l` should be `0`. + +### Phase 6: Verify + +Run ALL of these. Do not skip any. + +```bash +npm run lint +cd examples/GumTestApp/android && ./gradlew assembleDebug +cd examples/GumTestApp/ios && pod install && \ + xcodebuild -workspace GumTestApp.xcworkspace -scheme GumTestApp \ + -sdk iphonesimulator -configuration Debug build +``` + +### Phase 7: Format native files + +```bash +git ls-files | grep -e "\(\.java\|\.h\|\.m\)$" | grep -v examples | xargs npx clang-format -i +``` + +Rebuild Android + iOS to confirm, then commit. + +### Phase 8: Update package-lock.json + +If `package.json` dependencies changed, lock file will be stale. 
+ +```bash +npm install +git add package-lock.json && git commit -m "chore: update package-lock.json" +``` + +## Preservation rules + +These MUST NOT change during sync: + +| File | Guard | +|------|-------| +| `android/build.gradle` | Must keep `io.getstream:stream-video-webrtc-android:*` | +| `stream-react-native-webrtc.podspec` | Must keep `StreamWebRTC` dependency | +| `ios/RCTWebRTC/Utils/AudioDeviceModule/` | Fork's custom audio engine — untouched | +| `SpeechActivityDetector.java` | Fork's custom VAD — untouched | +| `AudioDeviceModule.ts`, `AudioDeviceModuleEvents.ts` | Fork's custom TS APIs — untouched | + +Post-sync: `grep -r "org.webrtc:google-webrtc\|webrtc-ios" --include="*.gradle" --include="*.podspec" .` must return nothing. + +## Pitfalls + +1. **Always run native builds, not just tsc.** Cherry-picks can pass tsc but fail gradlew/xcodebuild. + +2. **Native API names differ across WebRTC versions.** Enum values, type names, and method signatures may not exist in our WebRTC SDK. After cherry-picking from a fork on a different WebRTC version, verify types exist before building. + +3. **`git add -A` re-adds files you removed.** Use `git rm -f` (not `--cached`) to remove from both index and disk. + +4. **Auto-merged files need `git checkout HEAD --`, not `--ours`.** `--ours` only works on conflicted files. For auto-merged files with unwanted changes, use `git checkout HEAD -- `. + +5. **Watch for duplicates after conflict resolution.** Duplicate variable declarations, closing braces, or imports when keeping both sides of a conflict. + +6. **Advance merge-base for EVERY upstream remote.** If syncing with multiple upstreams, merge each one separately. Otherwise the un-advanced remote replays all its history on the next merge. + +7. **Upstream podspec/build files leak into cherry-picks.** Other forks have their own podspec (e.g., `livekit-react-native-webrtc.podspec`). Always `git rm` them when they appear. + +8. 
**Cross-check cherry-picks against all upstreams for reverts.** Before cherry-picking a commit from one upstream, search the other upstreams for the same change — it may have been tried and reverted. Run: `git log --all --oneline -S ""` to check whether the same change exists elsewhere in history with a subsequent revert. diff --git a/.eslintignore b/.eslintignore index 8fced415a..81234885b 100644 --- a/.eslintignore +++ b/.eslintignore @@ -1,5 +1,6 @@ examples lib +src/vendor tools metro.*.js diff --git a/Documentation/AndroidInstallation.md b/Documentation/AndroidInstallation.md index 0b00c5b90..8da1b60ee 100644 --- a/Documentation/AndroidInstallation.md +++ b/Documentation/AndroidInstallation.md @@ -72,6 +72,55 @@ In `android/app/proguard-rules.pro` add the following on a new line. -keep class org.webrtc.** { *; } ``` +## Set audio category (output) to media + +Audio output is treated as call audio by default. If you don't want your audio to be handled as a call stream, you need to change the audio category. 
To set the category: + +If your Android files are written in Java, modify `MainApplication.java`: +```java +// add imports +import com.oney.WebRTCModule.WebRTCModuleOptions; +import android.media.AudioAttributes; +import org.webrtc.audio.JavaAudioDeviceModule; + +public class MainApplication extends Application implements ReactApplication { + @Override + public void onCreate() { + // append this before WebRTCModule initializes + WebRTCModuleOptions options = WebRTCModuleOptions.getInstance(); + AudioAttributes audioAttributes = new AudioAttributes.Builder() + .setUsage(AudioAttributes.USAGE_MEDIA) + .setContentType(AudioAttributes.CONTENT_TYPE_SPEECH) + .build(); + options.audioDeviceModule = JavaAudioDeviceModule.builder(this) + .setAudioAttributes(audioAttributes) + .createAudioDeviceModule(); + } +} +``` + +If your Android files are written in Kotlin, modify `MainApplication.kt`: +```kt +// add imports +import com.oney.WebRTCModule.WebRTCModuleOptions +import android.media.AudioAttributes +import org.webrtc.audio.JavaAudioDeviceModule + +class MainApplication : Application(), ReactApplication { + override fun onCreate() { + // append this before WebRTCModule initializes + val options = WebRTCModuleOptions.getInstance() + val audioAttributes = AudioAttributes.Builder() + .setUsage(AudioAttributes.USAGE_MEDIA) + .setContentType(AudioAttributes.CONTENT_TYPE_SPEECH) + .build() + options.audioDeviceModule = JavaAudioDeviceModule.builder(this) + .setAudioAttributes(audioAttributes) + .createAudioDeviceModule() + } +} +``` + ## Fatal Exception: java.lang.UnsatisfiedLinkError ``` diff --git a/Documentation/BasicUsage.md b/Documentation/BasicUsage.md index 68d65c281..2a645685b 100644 --- a/Documentation/BasicUsage.md +++ b/Documentation/BasicUsage.md @@ -112,6 +112,22 @@ try { }; ``` +### Using Media Constraints on getDisplayMedia (Android Only) + +It is possible to use mediaConstraints on getDisplayMedia to restrict the user to capturing the default display using the 
custom boolean parameter `createConfigForDefaultDisplay`. +A resolution scale can also be applied using the `resolutionScale` parameter. Its value is a number between 0 and 1. + +This configuration is only available on Android, so you will have to add the `android` key to the constraints. + +```javascript + const displayMediaStreamConstraints = { + android: { + createConfigForDefaultDisplay: true, + resolutionScale: 0.5, + }, + }; +``` + ## Destroying the Media Stream Cycling all of the tracks and stopping them is more than enough to clean up after a call has finished. @@ -232,11 +248,9 @@ That will allow you to enable and disable video streams on demand while a call i ```javascript let sessionConstraints = { - mandatory: { - OfferToReceiveAudio: true, - OfferToReceiveVideo: true, - VoiceActivityDetection: true - } + offerToReceiveAudio: true, + offerToReceiveVideo: true, + voiceActivityDetection: true }; ``` diff --git a/Documentation/CallGuide.md b/Documentation/CallGuide.md index 5e4a86cc1..0b5d56848 100644 --- a/Documentation/CallGuide.md +++ b/Documentation/CallGuide.md @@ -143,11 +143,9 @@ So you can now start creating an offer which then needs sending send off to the ```javascript let sessionConstraints = { - mandatory: { - OfferToReceiveAudio: true, - OfferToReceiveVideo: true, - VoiceActivityDetection: true - } + offerToReceiveAudio: true, + offerToReceiveVideo: true, + voiceActivityDetection: true }; try { diff --git a/README.md b/README.md index 9ef908431..ebf6eb56b 100644 --- a/README.md +++ b/README.md @@ -75,6 +75,10 @@ Don't worry, there are plans to include a much more broader example with backend Come join our [Discourse Community](https://react-native-webrtc.discourse.group/) if you want to discuss any React Native and WebRTC related topics. Everyone is welcome and every little helps. +## Picture-in-Picture (PIP) + +This package does not include a built-in PIP implementation. 
PIP support is available via [`@stream-io/video-react-native-sdk`](https://github.com/GetStream/stream-video-js). + ## Related Projects Looking for extra functionality coverage? diff --git a/android/src/main/java/com/oney/WebRTCModule/AbstractVideoCaptureController.java b/android/src/main/java/com/oney/WebRTCModule/AbstractVideoCaptureController.java index 06742b946..cf8fe6c83 100644 --- a/android/src/main/java/com/oney/WebRTCModule/AbstractVideoCaptureController.java +++ b/android/src/main/java/com/oney/WebRTCModule/AbstractVideoCaptureController.java @@ -97,7 +97,8 @@ public boolean stopCapture() { public void applyConstraints(ReadableMap constraints, @Nullable Consumer onFinishedCallback) { if (onFinishedCallback != null) { - onFinishedCallback.accept(new UnsupportedOperationException("This video track does not support applyConstraints.")); + onFinishedCallback.accept( + new UnsupportedOperationException("This video track does not support applyConstraints.")); } } diff --git a/android/src/main/java/com/oney/WebRTCModule/CameraCaptureController.java b/android/src/main/java/com/oney/WebRTCModule/CameraCaptureController.java index a1549bff1..939d79350 100644 --- a/android/src/main/java/com/oney/WebRTCModule/CameraCaptureController.java +++ b/android/src/main/java/com/oney/WebRTCModule/CameraCaptureController.java @@ -3,7 +3,7 @@ import android.content.Context; import android.hardware.camera2.CameraManager; import android.util.Log; -import android.util.Pair; + import androidx.annotation.Nullable; import androidx.core.util.Consumer; @@ -39,6 +39,9 @@ public class CameraCaptureController extends AbstractVideoCaptureController { private final Context context; private final CameraEnumerator cameraEnumerator; + + private final String constraintDeviceId; + private final String constraintFacingMode; private ReadableMap constraints; /** @@ -46,7 +49,6 @@ public class CameraCaptureController extends AbstractVideoCaptureController { * {@link CameraEnumerator#createCapturer}. 
*/ private final CameraEventsHandler cameraEventsHandler = new CameraEventsHandler() { - @Override public void onCameraOpening(String cameraName) { super.onCameraOpening(cameraName); @@ -62,6 +64,9 @@ public CameraCaptureController(Context context, CameraEnumerator cameraEnumerato this.context = context; this.cameraEnumerator = cameraEnumerator; this.constraints = constraints; + + this.constraintDeviceId = ReactBridgeUtil.getMapStrValue(this.constraints, "deviceId"); + this.constraintFacingMode = ReactBridgeUtil.getMapStrValue(this.constraints, "facingMode"); } @Nullable @@ -113,8 +118,10 @@ public void applyConstraints(ReadableMap constraints, @Nullable Consumer { saveConstraints.run(); - if (targetWidth != oldTargetWidth || - targetHeight != oldTargetHeight || - targetFps != oldTargetFps) { + if (targetWidth != oldTargetWidth || targetHeight != oldTargetHeight || targetFps != oldTargetFps) { updateActualSize(finalCameraIndex, finalCameraName, videoCapturer); capturer.changeCaptureFormat(targetWidth, targetHeight, targetFps); } @@ -200,11 +205,8 @@ public void onCameraSwitchError(String s) { @Override protected VideoCapturer createVideoCapturer() { - String deviceId = ReactBridgeUtil.getMapStrValue(this.constraints, "deviceId"); - String facingMode = ReactBridgeUtil.getMapStrValue(this.constraints, "facingMode"); - - CreateCapturerResult result = createVideoCapturer(deviceId, facingMode); - if(result == null) { + CreateCapturerResult result = createVideoCapturer(constraintDeviceId, constraintFacingMode); + if (result == null) { return null; } @@ -233,12 +235,12 @@ private void updateActualSize(int cameraIndex, String cameraName, VideoCapturer * Constructs a new {@code VideoCapturer} instance attempting to satisfy * specific constraints. * - * @param deviceId the ID of the requested video device. If not - * {@code null} and a {@code VideoCapturer} can be created for it, then - * {@code facingMode} is ignored. 
+ * @param deviceId the ID of the requested video device. If not + * {@code null} and a {@code VideoCapturer} can be created for it, then + * {@code facingMode} is ignored. * @param facingMode the facing of the requested video source such as - * {@code user} and {@code environment}. If {@code null}, "user" is - * presumed. + * {@code user} and {@code environment}. If {@code null}, "user" is + * presumed. * @return a pair containing the deviceId and {@code VideoCapturer} satisfying the {@code facingMode} or * {@code deviceId} constraint, or null. */ @@ -325,7 +327,7 @@ private static class CreateCapturerResult { public final int cameraIndex; public final String cameraName; public final VideoCapturer videoCapturer; - + public CreateCapturerResult(int cameraIndex, String cameraName, VideoCapturer videoCapturer) { this.cameraIndex = cameraIndex; this.cameraName = cameraName; diff --git a/android/src/main/java/com/oney/WebRTCModule/DataPacketCryptorManager.java b/android/src/main/java/com/oney/WebRTCModule/DataPacketCryptorManager.java new file mode 100644 index 000000000..1252f7a9d --- /dev/null +++ b/android/src/main/java/com/oney/WebRTCModule/DataPacketCryptorManager.java @@ -0,0 +1,63 @@ +package com.oney.WebRTCModule; + +import android.util.Log; + +import org.webrtc.DataPacketCryptor; +import org.webrtc.DataPacketCryptorFactory; +import org.webrtc.FrameCryptorAlgorithm; +import org.webrtc.FrameCryptorKeyProvider; + +import javax.annotation.Nullable; + +public class DataPacketCryptorManager { + private static final String TAG = DataPacketCryptorManager.class.getSimpleName(); + private final DataPacketCryptor dataPacketCryptor; + private boolean isDisposed = false; + + public DataPacketCryptorManager(FrameCryptorAlgorithm algorithm, FrameCryptorKeyProvider keyProvider) { + dataPacketCryptor = + DataPacketCryptorFactory.createDataPacketCryptor(FrameCryptorAlgorithm.AES_GCM, keyProvider); + } + + @Nullable + public synchronized DataPacketCryptor.EncryptedPacket 
encrypt(String participantId, int keyIndex, byte[] payload) { + if (isDisposed) { + return null; + } + + DataPacketCryptor.EncryptedPacket packet = dataPacketCryptor.encrypt(participantId, keyIndex, payload); + + if (packet == null) { + Log.i(TAG, "Error encrypting packet: null packet"); + return null; + } + + if (packet.payload == null) { + Log.i(TAG, "Error encrypting packet: null payload"); + return null; + } + if (packet.iv == null) { + Log.i(TAG, "Error encrypting packet: null iv returned"); + return null; + } + + return packet; + } + + @Nullable + public synchronized byte[] decrypt(String participantId, DataPacketCryptor.EncryptedPacket packet) { + if (isDisposed) { + return null; + } + + return dataPacketCryptor.decrypt(participantId, packet); + } + + public synchronized void dispose() { + if (isDisposed) { + return; + } + isDisposed = true; + dataPacketCryptor.dispose(); + } +} \ No newline at end of file diff --git a/android/src/main/java/com/oney/WebRTCModule/GetUserMediaImpl.java b/android/src/main/java/com/oney/WebRTCModule/GetUserMediaImpl.java index 731ee8e5c..60856c7c6 100644 --- a/android/src/main/java/com/oney/WebRTCModule/GetUserMediaImpl.java +++ b/android/src/main/java/com/oney/WebRTCModule/GetUserMediaImpl.java @@ -5,10 +5,13 @@ import android.content.Context; import android.content.Intent; import android.content.ServiceConnection; +import android.media.projection.MediaProjectionConfig; import android.media.projection.MediaProjectionManager; +import android.os.Build; import android.os.IBinder; import android.util.DisplayMetrics; import android.util.Log; + import androidx.core.util.Consumer; import com.facebook.react.bridge.Arguments; @@ -18,6 +21,7 @@ import com.facebook.react.bridge.ReactApplicationContext; import com.facebook.react.bridge.ReadableArray; import com.facebook.react.bridge.ReadableMap; +import com.facebook.react.bridge.ReadableType; import com.facebook.react.bridge.UiThreadUtil; import com.facebook.react.bridge.WritableArray; 
import com.facebook.react.bridge.WritableMap; @@ -61,6 +65,8 @@ public class GetUserMediaImpl { private Promise displayMediaPromise; private Intent mediaProjectionPermissionResultData; + private boolean createConfigForDefaultDisplay = false; + private float resolutionScale = 1.0f; /** * Returns the MediaProjection permission result data Intent. @@ -76,9 +82,7 @@ public Intent getMediaProjectionPermissionResultData() { public void onServiceConnected(ComponentName name, IBinder service) { // Service is now bound, you can call createScreenStream() Log.d(TAG, "MediaProjectionService bound, creating screen stream."); - ThreadUtils.runOnExecutor(() -> { - createScreenStream(); - }); + ThreadUtils.runOnExecutor(() -> { createScreenStream(); }); } @Override @@ -105,7 +109,6 @@ public void onActivityResult(Activity activity, int requestCode, int resultCode, mediaProjectionPermissionResultData = data; MediaProjectionService.launch(activity, mediaProjectionServiceConnection); - } } }); @@ -306,10 +309,11 @@ void disposeTrack(String id) { void applyConstraints(String trackId, ReadableMap constraints, Promise promise) { TrackPrivate track = tracks.get(trackId); if (track != null && track.videoCaptureController instanceof AbstractVideoCaptureController) { - AbstractVideoCaptureController captureController = (AbstractVideoCaptureController) track.videoCaptureController; + AbstractVideoCaptureController captureController = + (AbstractVideoCaptureController) track.videoCaptureController; captureController.applyConstraints(constraints, new Consumer() { public void accept(Exception e) { - if(e != null) { + if (e != null) { promise.reject(e); return; } @@ -322,7 +326,41 @@ public void accept(Exception e) { } } - void getDisplayMedia(Promise promise) { + void initializeConstraints(ReadableMap constraints) { + // Handle the incoming params + + ReadableMap androidConstraints = null; + if (constraints.hasKey("android") && constraints.getType("android") == ReadableType.Map) { + 
androidConstraints = constraints.getMap("android"); + } + + // Default values + boolean createConfigForDefaultDisplay = false; + float scale = 1.0f; + + if (androidConstraints != null) { + // MediaProjectionConfig needs API level 34 + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE + && androidConstraints.hasKey("createConfigForDefaultDisplay") + && androidConstraints.getType("createConfigForDefaultDisplay") == ReadableType.Boolean) { + createConfigForDefaultDisplay = androidConstraints.getBoolean("createConfigForDefaultDisplay"); + } + if (androidConstraints.hasKey("resolutionScale") + && androidConstraints.getType("resolutionScale") == ReadableType.Number) { + scale = (float) androidConstraints.getDouble("resolutionScale"); + } + } + + this.createConfigForDefaultDisplay = createConfigForDefaultDisplay; + // Clamp the value to [0, 1] + this.resolutionScale = Math.max(0.0f, Math.min(1.0f, scale)); + + Log.d(TAG, + "initializeConstraints: createConfigForDefaultDisplay=" + this.createConfigForDefaultDisplay + + " resolutionScale=" + this.resolutionScale); + } + + void getDisplayMedia(final ReadableMap constraints, Promise promise) { + if (this.displayMediaPromise != null) { + promise.reject(new RuntimeException("Another operation is pending.")); + return; @@ -334,6 +372,8 @@ void getDisplayMedia(Promise promise) { return; } + this.initializeConstraints(constraints); + this.displayMediaPromise = promise; MediaProjectionManager mediaProjectionManager = @@ -344,8 +384,18 @@ void getDisplayMedia(Promise promise) { UiThreadUtil.runOnUiThread(new Runnable() { @Override public void run() { - currentActivity.startActivityForResult( - mediaProjectionManager.createScreenCaptureIntent(), PERMISSION_REQUEST_CODE); + if (createConfigForDefaultDisplay) { + // MediaProjectionConfig needs API level 34 + // Return mediaProjection which restricts the user to capturing the default display + currentActivity.startActivityForResult( + 
mediaProjectionManager.createScreenCaptureIntent( + MediaProjectionConfig.createConfigForDefaultDisplay()), + PERMISSION_REQUEST_CODE); + } else { + // Return mediaProjection which allows the user to decide which region is captured + currentActivity.startActivityForResult( + mediaProjectionManager.createScreenCaptureIntent(), PERMISSION_REQUEST_CODE); + } } }); @@ -442,7 +492,7 @@ private VideoTrack createScreenTrack() { int width = displayMetrics.widthPixels; int height = displayMetrics.heightPixels; ScreenCaptureController screenCaptureController = new ScreenCaptureController( - reactContext.getCurrentActivity(), width, height, mediaProjectionPermissionResultData); + reactContext.getCurrentActivity(), width, height, mediaProjectionPermissionResultData, resolutionScale); return createVideoTrack(screenCaptureController); } @@ -478,7 +528,8 @@ VideoTrack createVideoTrack(AbstractVideoCaptureController videoCaptureControlle localTrackAdapter.addDimensionDetector(track); track.setEnabled(true); - tracks.put(id, new TrackPrivate(track, videoSource, videoCaptureController, surfaceTextureHelper, localTrackAdapter)); + tracks.put(id, + new TrackPrivate(track, videoSource, videoCaptureController, surfaceTextureHelper, localTrackAdapter)); videoCaptureController.startCapture(); @@ -497,10 +548,10 @@ MediaStreamTrack cloneTrack(String trackId) { MediaStreamTrack nativeTrack = track.track; final MediaStreamTrack clonedNativeTrack; VideoTrackAdapter clonedVideoTrackAdapter = null; - + if (nativeTrack instanceof VideoTrack) { clonedNativeTrack = pcFactory.createVideoTrack(id, (VideoSource) track.mediaSource); - + // Create dimension detection for cloned video tracks clonedVideoTrackAdapter = new VideoTrackAdapter(webRTCModule, -1); clonedVideoTrackAdapter.addDimensionDetector((VideoTrack) clonedNativeTrack); @@ -509,13 +560,11 @@ MediaStreamTrack cloneTrack(String trackId) { } clonedNativeTrack.setEnabled(nativeTrack.enabled()); - final TrackPrivate clone = new TrackPrivate( - 
clonedNativeTrack, - track.mediaSource, - track.videoCaptureController, - track.surfaceTextureHelper, - clonedVideoTrackAdapter - ); + final TrackPrivate clone = new TrackPrivate(clonedNativeTrack, + track.mediaSource, + track.videoCaptureController, + track.surfaceTextureHelper, + clonedVideoTrackAdapter); clone.setParent(track); tracks.put(id, clone); @@ -536,20 +585,22 @@ void setVideoEffects(String trackId, ReadableArray names) { SurfaceTextureHelper surfaceTextureHelper = track.surfaceTextureHelper; if (names != null) { - List processors = names.toArrayList().stream() - .filter(name -> name instanceof String) - .map(name -> { - VideoFrameProcessor videoFrameProcessor = ProcessorProvider.getProcessor((String) name); - if (videoFrameProcessor == null) { - Log.e(TAG, "no videoFrameProcessor associated with this name: " + name); - } - return videoFrameProcessor; - }) - .filter(Objects::nonNull) - .collect(Collectors.toList()); - - VideoEffectProcessor videoEffectProcessor = - new VideoEffectProcessor(processors, surfaceTextureHelper); + List processors = + names.toArrayList() + .stream() + .filter(name -> name instanceof String) + .map(name -> { + VideoFrameProcessor videoFrameProcessor = + ProcessorProvider.getProcessor((String) name); + if (videoFrameProcessor == null) { + Log.e(TAG, "no videoFrameProcessor associated with this name: " + name); + } + return videoFrameProcessor; + }) + .filter(Objects::nonNull) + .collect(Collectors.toList()); + + VideoEffectProcessor videoEffectProcessor = new VideoEffectProcessor(processors, surfaceTextureHelper); videoSource.setVideoProcessor(videoEffectProcessor); } else { @@ -558,6 +609,15 @@ void setVideoEffects(String trackId, ReadableArray names) { } } + void registerTrack(AudioTrack track, AudioSource source) { + tracks.put(track.id(), new TrackPrivate(track, source, null, null)); + } + + void registerTrack(VideoTrack track, VideoSource source, AbstractVideoCaptureController controller, + SurfaceTextureHelper 
surfaceTextureHelper) { + tracks.put(track.id(), new TrackPrivate(track, source, controller, surfaceTextureHelper)); + } + /** * Application/library-specific private members of local * {@code MediaStreamTrack}s created by {@code GetUserMediaImpl}. @@ -669,5 +729,7 @@ public boolean isClone() { } } - public interface BiConsumer { void accept(T t, U u); } + public interface BiConsumer { + void accept(T t, U u); + } } diff --git a/android/src/main/java/com/oney/WebRTCModule/MediaProjectionNotification.java b/android/src/main/java/com/oney/WebRTCModule/MediaProjectionNotification.java index be6d05c3d..8075966d7 100644 --- a/android/src/main/java/com/oney/WebRTCModule/MediaProjectionNotification.java +++ b/android/src/main/java/com/oney/WebRTCModule/MediaProjectionNotification.java @@ -1,18 +1,14 @@ package com.oney.WebRTCModule; import android.app.Notification; -import android.app.Service; -import android.content.Context; - -import androidx.core.app.NotificationCompat; - import android.app.NotificationChannel; import android.app.NotificationManager; - +import android.app.Service; +import android.content.Context; import android.os.Build; - import android.util.Log; +import androidx.core.app.NotificationCompat; /** * Helper class for creating the media projection notification which is used with @@ -23,7 +19,6 @@ class MediaProjectionNotification { static final String ONGOING_CONFERENCE_CHANNEL_ID = "OngoingConferenceChannel"; - static void createNotificationChannel(Context context) { if (Build.VERSION.SDK_INT < Build.VERSION_CODES.O) { return; @@ -34,22 +29,19 @@ static void createNotificationChannel(Context context) { return; } - NotificationManager notificationManager - = (NotificationManager) context.getSystemService(Service.NOTIFICATION_SERVICE); + NotificationManager notificationManager = + (NotificationManager) context.getSystemService(Service.NOTIFICATION_SERVICE); - NotificationChannel channel - = 
notificationManager.getNotificationChannel(ONGOING_CONFERENCE_CHANNEL_ID); + NotificationChannel channel = notificationManager.getNotificationChannel(ONGOING_CONFERENCE_CHANNEL_ID); if (channel != null) { // The channel was already created, no need to do it again. return; } - channel = new NotificationChannel( - ONGOING_CONFERENCE_CHANNEL_ID, + channel = new NotificationChannel(ONGOING_CONFERENCE_CHANNEL_ID, context.getString(R.string.ongoing_notification_channel_name), - NotificationManager.IMPORTANCE_DEFAULT - ); + NotificationManager.IMPORTANCE_DEFAULT); channel.enableLights(false); channel.enableVibration(false); channel.setShowBadge(false); @@ -60,8 +52,7 @@ static void createNotificationChannel(Context context) { static Notification buildMediaProjectionNotification(Context context) { NotificationCompat.Builder builder = new NotificationCompat.Builder(context, ONGOING_CONFERENCE_CHANNEL_ID); - builder - .setContentTitle(context.getString(R.string.media_projection_notification_title)) + builder.setContentTitle(context.getString(R.string.media_projection_notification_title)) .setContentText(context.getString(R.string.media_projection_notification_text)) .setOngoing(true) .setSilent(true) @@ -69,7 +60,8 @@ static Notification buildMediaProjectionNotification(Context context) { .setAutoCancel(false) .setVisibility(NotificationCompat.VISIBILITY_PUBLIC) .setOnlyAlertOnce(true) - .setSmallIcon(context.getResources().getIdentifier("ic_notification", "drawable", context.getPackageName())) + .setSmallIcon( + context.getResources().getIdentifier("ic_notification", "drawable", context.getPackageName())) .setForegroundServiceBehavior(NotificationCompat.FOREGROUND_SERVICE_IMMEDIATE); return builder.build(); diff --git a/android/src/main/java/com/oney/WebRTCModule/MediaProjectionService.java b/android/src/main/java/com/oney/WebRTCModule/MediaProjectionService.java index 100a08625..622aef23e 100644 --- a/android/src/main/java/com/oney/WebRTCModule/MediaProjectionService.java 
+++ b/android/src/main/java/com/oney/WebRTCModule/MediaProjectionService.java
@@ -15,7 +15,6 @@ import java.util.Random;
-
 /**
  * This class implements an Android {@link Service}, a foreground one specifically, and it's
  * responsible for presenting an ongoing notification when a conference is in progress.
@@ -81,7 +80,6 @@ public IBinder onBind(Intent intent) {
     @Override
     public int onStartCommand(Intent intent, int flags, int startId) {
-
         Notification notification = MediaProjectionNotification.buildMediaProjectionNotification(this);
 
         final Random random = new Random();
diff --git a/android/src/main/java/com/oney/WebRTCModule/PeerConnectionObserver.java b/android/src/main/java/com/oney/WebRTCModule/PeerConnectionObserver.java
index 6ba19a041..280568b24 100644
--- a/android/src/main/java/com/oney/WebRTCModule/PeerConnectionObserver.java
+++ b/android/src/main/java/com/oney/WebRTCModule/PeerConnectionObserver.java
@@ -11,6 +11,7 @@ import com.facebook.react.bridge.WritableArray;
 import com.facebook.react.bridge.WritableMap;
 
+import org.webrtc.AudioTrack;
 import org.webrtc.DataChannel;
 import org.webrtc.IceCandidate;
 import org.webrtc.MediaStream;
@@ -127,6 +128,20 @@ RtpTransceiver addTransceiver(MediaStreamTrack track, RtpTransceiver.RtpTranscei
         return peerConnection.addTransceiver(track, init);
     }
 
+    RtpReceiver getReceiver(String id) {
+        if (this.peerConnection == null) {
+            return null;
+        }
+
+        for (RtpReceiver receiver : this.peerConnection.getReceivers()) {
+            if (receiver.id().equals(id)) {
+                return receiver;
+            }
+        }
+
+        return null;
+    }
+
     RtpSender getSender(String id) {
         if (this.peerConnection == null) {
             return null;
@@ -262,7 +277,8 @@ public void receiverGetStats(String receiverId, Promise promise) {
             return;
         }
 
-        peerConnection.getStats(targetReceiver, rtcStatsReport -> promise.resolve(StringUtils.statsToJSON(rtcStatsReport)));
+        peerConnection.getStats(
+                targetReceiver, rtcStatsReport -> promise.resolve(StringUtils.statsToJSON(rtcStatsReport)));
     }
 
     public void senderGetStats(String senderId, Promise promise) {
@@ -280,7 +296,8 @@ public void senderGetStats(String senderId, Promise promise) {
             return;
         }
 
-        peerConnection.getStats(targetSender, rtcStatsReport -> promise.resolve(StringUtils.statsToJSON(rtcStatsReport)));
+        peerConnection.getStats(
+                targetSender, rtcStatsReport -> promise.resolve(StringUtils.statsToJSON(rtcStatsReport)));
     }
 
     @Override
@@ -445,6 +462,8 @@ public void onAddTrack(final RtpReceiver receiver, final MediaStream[] mediaStre
             if (track.kind().equals(MediaStreamTrack.VIDEO_TRACK_KIND)) {
                 videoTrackAdapters.addAdapter((VideoTrack) track);
                 videoTrackAdapters.addDimensionDetector((VideoTrack) track);
+            } else if (track.kind().equals(MediaStreamTrack.AUDIO_TRACK_KIND)) {
+                ((AudioTrack) track).setVolume(WebRTCModuleOptions.getInstance().defaultTrackVolume);
             }
             remoteTracks.put(track.id(), track);
         }
diff --git a/android/src/main/java/com/oney/WebRTCModule/RTCCryptoManager.java b/android/src/main/java/com/oney/WebRTCModule/RTCCryptoManager.java
new file mode 100644
index 000000000..22682b7de
--- /dev/null
+++ b/android/src/main/java/com/oney/WebRTCModule/RTCCryptoManager.java
@@ -0,0 +1,487 @@
+package com.oney.WebRTCModule;
+
+import android.util.Base64;
+import android.util.Log;
+
+import androidx.annotation.NonNull;
+import androidx.annotation.Nullable;
+
+import com.facebook.react.bridge.Arguments;
+import com.facebook.react.bridge.Promise;
+import com.facebook.react.bridge.ReadableMap;
+import com.facebook.react.bridge.WritableMap;
+
+import org.webrtc.DataPacketCryptor;
+import org.webrtc.FrameCryptor;
+import org.webrtc.FrameCryptorAlgorithm;
+import org.webrtc.FrameCryptorFactory;
+import org.webrtc.FrameCryptorKeyProvider;
+import org.webrtc.RtpReceiver;
+import org.webrtc.RtpSender;
+
+import java.nio.charset.StandardCharsets;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Objects;
+import java.util.UUID;
+
+public class RTCCryptoManager {
+    private static final String TAG = "RTCFrameCryptor";
+    private final Map<String, FrameCryptor> frameCryptos = new HashMap<>();
+    private final Map<String, FrameCryptorStateObserver> frameCryptoObservers = new HashMap<>();
+    private final Map<String, FrameCryptorKeyProvider> keyProviders = new HashMap<>();
+    private final Map<String, DataPacketCryptorManager> dataPacketCryptors = new HashMap<>();
+    private final WebRTCModule webRTCModule;
+
+    public RTCCryptoManager(WebRTCModule webRTCModule) {
+        this.webRTCModule = webRTCModule;
+    }
+
+    private void sendEvent(String eventName, WritableMap params) {
+        webRTCModule.sendEvent(eventName, params);
+    }
+
+    class FrameCryptorStateObserver implements FrameCryptor.Observer {
+        public FrameCryptorStateObserver(String frameCryptorId) {
+            this.frameCryptorId = frameCryptorId;
+        }
+
+        private final String frameCryptorId;
+
+        private String frameCryptorErrorStateToString(FrameCryptor.FrameCryptionState state) {
+            switch (state) {
+                case NEW:
+                    return "new";
+                case OK:
+                    return "ok";
+                case DECRYPTIONFAILED:
+                    return "decryptionFailed";
+                case ENCRYPTIONFAILED:
+                    return "encryptionFailed";
+                case INTERNALERROR:
+                    return "internalError";
+                case KEYRATCHETED:
+                    return "keyRatcheted";
+                case MISSINGKEY:
+                    return "missingKey";
+                default:
+                    throw new IllegalArgumentException("Unknown FrameCryptorErrorState: " + state);
+            }
+        }
+
+        @Override
+        public void onFrameCryptionStateChanged(String participantId, FrameCryptor.FrameCryptionState state) {
+            WritableMap event = Arguments.createMap();
+            event.putString("event", "frameCryptionStateChanged");
+            event.putString("participantId", participantId);
+            event.putString("state", frameCryptorErrorStateToString(state));
+            event.putString("frameCryptorId", frameCryptorId);
+            sendEvent("frameCryptionStateChanged", event);
+        }
+    }
+
+    private FrameCryptorAlgorithm frameCryptorAlgorithmFromInt(int algorithm) {
+        // StreamWebRTC M137 only supports AES_GCM
+        return FrameCryptorAlgorithm.AES_GCM;
+    }
+
+    public String frameCryptorFactoryCreateFrameCryptor(ReadableMap params) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            Log.w(TAG, "frameCryptorFactoryCreateFrameCryptorFailed: keyProvider not found");
+            return null;
+        }
+        int peerConnectionId = params.getInt("peerConnectionId");
+        PeerConnectionObserver pco = webRTCModule.getPeerConnectionObserver(peerConnectionId);
+        if (pco == null) {
+            Log.w(TAG, "frameCryptorFactoryCreateFrameCryptorFailed: peerConnection not found");
+            return null;
+        }
+        String participantId = params.getString("participantId");
+        String type = params.getString("type");
+        int algorithm = params.getInt("algorithm");
+        String rtpSenderId = params.getString("rtpSenderId");
+        String rtpReceiverId = params.getString("rtpReceiverId");
+
+        if (type == null || !(type.equals("sender") || type.equals("receiver"))) {
+            Log.w(TAG, "frameCryptorFactoryCreateFrameCryptorFailed: type must be sender or receiver");
+            return null;
+        } else if (type.equals("sender")) {
+            RtpSender rtpSender = pco.getSender(rtpSenderId);
+
+            FrameCryptor frameCryptor = FrameCryptorFactory.createFrameCryptorForRtpSender(webRTCModule.mFactory,
+                    rtpSender,
+                    participantId,
+                    frameCryptorAlgorithmFromInt(algorithm),
+                    keyProvider);
+            String frameCryptorId = UUID.randomUUID().toString();
+            frameCryptos.put(frameCryptorId, frameCryptor);
+            FrameCryptorStateObserver observer = new FrameCryptorStateObserver(frameCryptorId);
+            frameCryptor.setObserver(observer);
+            frameCryptoObservers.put(frameCryptorId, observer);
+
+            return frameCryptorId;
+        } else {
+            RtpReceiver rtpReceiver = pco.getReceiver(rtpReceiverId);
+
+            FrameCryptor frameCryptor = FrameCryptorFactory.createFrameCryptorForRtpReceiver(webRTCModule.mFactory,
+                    rtpReceiver,
+                    participantId,
+                    frameCryptorAlgorithmFromInt(algorithm),
+                    keyProvider);
+            String frameCryptorId = UUID.randomUUID().toString();
+            frameCryptos.put(frameCryptorId, frameCryptor);
+            FrameCryptorStateObserver observer = new FrameCryptorStateObserver(frameCryptorId);
+            frameCryptor.setObserver(observer);
+            frameCryptoObservers.put(frameCryptorId, observer);
+
+            return frameCryptorId;
+        }
+    }
+
+    public void frameCryptorSetKeyIndex(ReadableMap params, @NonNull Promise result) {
+        String frameCryptorId = params.getString("frameCryptorId");
+        FrameCryptor frameCryptor = frameCryptos.get(frameCryptorId);
+        if (frameCryptor == null) {
+            result.reject("frameCryptorSetKeyIndexFailed", "frameCryptor not found", (Throwable) null);
+            return;
+        }
+        int keyIndex = params.getInt("keyIndex");
+        frameCryptor.setKeyIndex(keyIndex);
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putBoolean("result", true);
+        result.resolve(paramsResult);
+    }
+
+    public void frameCryptorGetKeyIndex(ReadableMap params, @NonNull Promise result) {
+        String frameCryptorId = params.getString("frameCryptorId");
+        FrameCryptor frameCryptor = frameCryptos.get(frameCryptorId);
+        if (frameCryptor == null) {
+            result.reject("frameCryptorGetKeyIndexFailed", "frameCryptor not found", (Throwable) null);
+            return;
+        }
+        int keyIndex = frameCryptor.getKeyIndex();
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putInt("keyIndex", keyIndex);
+        result.resolve(paramsResult);
+    }
+
+    public void frameCryptorSetEnabled(ReadableMap params, @NonNull Promise result) {
+        String frameCryptorId = params.getString("frameCryptorId");
+        FrameCryptor frameCryptor = frameCryptos.get(frameCryptorId);
+        if (frameCryptor == null) {
+            result.reject("frameCryptorSetEnabledFailed", "frameCryptor not found", (Throwable) null);
+            return;
+        }
+        boolean enabled = params.getBoolean("enabled");
+        frameCryptor.setEnabled(enabled);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putBoolean("result", enabled);
+        result.resolve(paramsResult);
+    }
+
+    public void frameCryptorGetEnabled(ReadableMap params, @NonNull Promise result) {
+        String frameCryptorId = params.getString("frameCryptorId");
+        FrameCryptor frameCryptor = frameCryptos.get(frameCryptorId);
+        if (frameCryptor == null) {
+            result.reject("frameCryptorGetEnabledFailed", "frameCryptor not found", (Throwable) null);
+            return;
+        }
+        boolean enabled = frameCryptor.isEnabled();
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putBoolean("enabled", enabled);
+        result.resolve(paramsResult);
+    }
+
+    public void frameCryptorDispose(ReadableMap params, @NonNull Promise result) {
+        String frameCryptorId = params.getString("frameCryptorId");
+        FrameCryptor frameCryptor = frameCryptos.get(frameCryptorId);
+        if (frameCryptor == null) {
+            result.reject("frameCryptorDisposeFailed", "frameCryptor not found", (Throwable) null);
+            return;
+        }
+        frameCryptor.dispose();
+        frameCryptos.remove(frameCryptorId);
+        frameCryptoObservers.remove(frameCryptorId);
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("result", "success");
+        result.resolve(paramsResult);
+    }
+
+    @Nullable
+    public String frameCryptorFactoryCreateKeyProvider(ReadableMap keyProviderOptions) {
+        String keyProviderId = UUID.randomUUID().toString();
+
+        if (keyProviderOptions == null) {
+            Log.w(TAG, "frameCryptorFactoryCreateKeyProvider: keyProviderOptions is null!");
+            return null;
+        }
+        boolean sharedKey = keyProviderOptions.getBoolean("sharedKey");
+        int ratchetWindowSize = keyProviderOptions.getInt("ratchetWindowSize");
+        int failureTolerance = keyProviderOptions.getInt("failureTolerance");
+
+        byte[] ratchetSalt = getBytesFromMap(keyProviderOptions, "ratchetSalt", "ratchetSaltIsBase64");
+
+        byte[] uncryptedMagicBytes = new byte[0];
+        if (keyProviderOptions.hasKey("uncryptedMagicBytes")) {
+            uncryptedMagicBytes = Base64.decode(keyProviderOptions.getString("uncryptedMagicBytes"), Base64.DEFAULT);
+        }
+        int keyRingSize = (int) keyProviderOptions.getInt("keyRingSize");
+        boolean discardFrameWhenCryptorNotReady =
+                (boolean) keyProviderOptions.getBoolean("discardFrameWhenCryptorNotReady");
+        FrameCryptorKeyProvider keyProvider = FrameCryptorFactory.createFrameCryptorKeyProvider(sharedKey,
+                ratchetSalt,
+                ratchetWindowSize,
+                uncryptedMagicBytes,
+                failureTolerance,
+                keyRingSize,
+                discardFrameWhenCryptorNotReady);
+        keyProviders.put(keyProviderId, keyProvider);
+        return keyProviderId;
+    }
+
+    public void keyProviderSetSharedKey(ReadableMap params, @NonNull Promise result) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject("keyProviderSetKeySharedFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+        int keyIndex = params.getInt("keyIndex");
+        byte[] key = getBytesFromMap(params, "key", "keyIsBase64");
+        keyProvider.setSharedKey(keyIndex, key);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putBoolean("result", true);
+        result.resolve(paramsResult);
+    }
+
+    public void keyProviderRatchetSharedKey(ReadableMap params, @NonNull Promise result) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject("keyProviderRatchetSharedKeyFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+        int keyIndex = params.getInt("keyIndex");
+
+        byte[] newKey = keyProvider.ratchetSharedKey(keyIndex);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("result", Base64.encodeToString(newKey, Base64.NO_WRAP));
+        result.resolve(paramsResult);
+    }
+
+    public void keyProviderExportSharedKey(ReadableMap params, @NonNull Promise result) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject("keyProviderExportSharedKeyFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+        int keyIndex = params.getInt("keyIndex");
+
+        byte[] key = keyProvider.exportSharedKey(keyIndex);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("result", Base64.encodeToString(key, Base64.NO_WRAP));
+        result.resolve(paramsResult);
+    }
+
+    public void keyProviderSetKey(ReadableMap params, @NonNull Promise result) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject("keyProviderSetKeyFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+        int keyIndex = params.getInt("keyIndex");
+        String participantId = params.getString("participantId");
+        byte[] key = getBytesFromMap(params, "key", "keyIsBase64");
+        keyProvider.setKey(participantId, keyIndex, key);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putBoolean("result", true);
+        result.resolve(paramsResult);
+    }
+
+    public void keyProviderRatchetKey(ReadableMap params, @NonNull Promise result) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject("keyProviderSetKeysFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+        String participantId = params.getString("participantId");
+        int keyIndex = params.getInt("keyIndex");
+
+        byte[] newKey = keyProvider.ratchetKey(participantId, keyIndex);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("result", Base64.encodeToString(newKey, Base64.NO_WRAP));
+        result.resolve(paramsResult);
+    }
+
+    public void keyProviderExportKey(ReadableMap params, @NonNull Promise result) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject("keyProviderExportKeyFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+        String participantId = params.getString("participantId");
+        int keyIndex = params.getInt("keyIndex");
+
+        byte[] key = keyProvider.exportKey(participantId, keyIndex);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("result", Base64.encodeToString(key, Base64.NO_WRAP));
+        result.resolve(paramsResult);
+    }
+
+    public void keyProviderSetSifTrailer(ReadableMap params, @NonNull Promise result) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject("keyProviderSetSifTrailerFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+        byte[] sifTrailer = Base64.decode(params.getString("sifTrailer"), Base64.NO_WRAP);
+        keyProvider.setSifTrailer(sifTrailer);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putBoolean("result", true);
+        result.resolve(paramsResult);
+    }
+
+    public void keyProviderDispose(ReadableMap params, @NonNull Promise result) {
+        String keyProviderId = params.getString("keyProviderId");
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject("keyProviderDisposeFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+        keyProvider.dispose();
+        keyProviders.remove(keyProviderId);
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("result", "success");
+        result.resolve(paramsResult);
+    }
+
+    public void dataPacketCryptorFactoryCreateDataPacketCryptor(ReadableMap params, @NonNull Promise result) {
+        int algorithm = params.getInt("algorithm");
+        String keyProviderId = params.getString("keyProviderId");
+
+        FrameCryptorKeyProvider keyProvider = keyProviders.get(keyProviderId);
+        if (keyProvider == null) {
+            result.reject(
+                    "dataPacketCryptorFactoryCreateDataPacketCryptorFailed", "keyProvider not found", (Throwable) null);
+            return;
+        }
+
+        DataPacketCryptorManager cryptor =
+                new DataPacketCryptorManager(frameCryptorAlgorithmFromInt(algorithm), keyProvider);
+
+        String dataPacketCryptorId = UUID.randomUUID().toString();
+        dataPacketCryptors.put(dataPacketCryptorId, cryptor);
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("dataPacketCryptorId", dataPacketCryptorId);
+        result.resolve(paramsResult);
+    }
+
+    public void dataPacketCryptorEncrypt(ReadableMap params, @NonNull Promise result) {
+        String dataPacketCryptorId = params.getString("dataPacketCryptorId");
+        String participantId = params.getString("participantId");
+        int keyIndex = params.getInt("keyIndex");
+        byte[] data = getBytesFromMap(params, "data", null);
+
+        DataPacketCryptorManager cryptor = dataPacketCryptors.get(dataPacketCryptorId);
+
+        if (cryptor == null) {
+            result.reject("dataPacketCryptorEncryptFailed", "data packet cryptor not found", (Throwable) null);
+            return;
+        }
+
+        DataPacketCryptor.EncryptedPacket packet = cryptor.encrypt(participantId, keyIndex, data);
+
+        if (packet == null) {
+            result.reject("dataPacketCryptorEncryptFailed", "null packet", (Throwable) null);
+            return;
+        }
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("payload", Base64.encodeToString(packet.payload, Base64.NO_WRAP));
+        paramsResult.putString("iv", Base64.encodeToString(packet.iv, Base64.NO_WRAP));
+        paramsResult.putInt("keyIndex", packet.keyIndex);
+        result.resolve(paramsResult);
+    }
+
+    public void dataPacketCryptorDecrypt(ReadableMap params, @NonNull Promise result) {
+        String dataPacketCryptorId = params.getString("dataPacketCryptorId");
+        String participantId = params.getString("participantId");
+        int keyIndex = params.getInt("keyIndex");
+        byte[] payload = getBytesFromMap(params, "payload", null);
+        byte[] iv = getBytesFromMap(params, "iv", null);
+
+        DataPacketCryptorManager cryptor = dataPacketCryptors.get(dataPacketCryptorId);
+
+        if (cryptor == null) {
+            result.reject("dataPacketCryptorDecryptFailed", "data packet cryptor not found", (Throwable) null);
+            return;
+        }
+
+        DataPacketCryptor.EncryptedPacket packet = new DataPacketCryptor.EncryptedPacket(payload, iv, keyIndex);
+
+        byte[] decryptedData = cryptor.decrypt(participantId, packet);
+
+        if (decryptedData == null) {
+            result.reject("dataPacketCryptorDecryptFailed", "null decrypted data", (Throwable) null);
+            return;
+        }
+
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("data", Base64.encodeToString(decryptedData, Base64.NO_WRAP));
+        result.resolve(paramsResult);
+    }
+
+    public void dataPacketCryptorDispose(ReadableMap params, @NonNull Promise result) {
+        String dataPacketCryptorId = params.getString("dataPacketCryptorId");
+
+        DataPacketCryptorManager cryptor = dataPacketCryptors.get(dataPacketCryptorId);
+
+        if (cryptor == null) {
+            result.reject("dataPacketCryptorDisposeFailed", "data packet cryptor not found", (Throwable) null);
+            return;
+        }
+
+        cryptor.dispose();
+        dataPacketCryptors.remove(dataPacketCryptorId);
+        WritableMap paramsResult = Arguments.createMap();
+        paramsResult.putString("result", "success");
+
+        result.resolve(paramsResult);
+    }
+
+    private byte[] getBytesFromMap(ReadableMap map, String key, @Nullable String isBase64Key) {
+        boolean isBase64;
+
+        if (isBase64Key != null) {
+            isBase64 = map.getBoolean(isBase64Key);
+        } else {
+            isBase64 = true;
+        }
+
+        byte[] bytes;
+
+        if (isBase64) {
+            bytes = Base64.decode(map.getString(key), Base64.DEFAULT);
+        } else {
+            bytes = Objects.requireNonNull(map.getString(key)).getBytes(StandardCharsets.UTF_8);
+        }
+        return bytes;
+    }
+}
diff --git a/android/src/main/java/com/oney/WebRTCModule/ScreenCaptureController.java b/android/src/main/java/com/oney/WebRTCModule/ScreenCaptureController.java
index 200fdf33e..b291b377a 100644
--- a/android/src/main/java/com/oney/WebRTCModule/ScreenCaptureController.java
+++ b/android/src/main/java/com/oney/WebRTCModule/ScreenCaptureController.java
@@ -25,11 +25,9 @@ public class ScreenCaptureController extends AbstractVideoCaptureController {
     private final Context context;
 
-    public ScreenCaptureController(Context context,
-            int width,
-            int height,
-            Intent mediaProjectionPermissionResultData) {
-        super(width, height, DEFAULT_FPS);
+    public ScreenCaptureController(
+            Context context, int width, int height, Intent mediaProjectionPermissionResultData, float resolutionScale) {
+        super((int) (width * resolutionScale), (int) (height * resolutionScale), DEFAULT_FPS);
 
         this.mediaProjectionPermissionResultData = mediaProjectionPermissionResultData;
@@ -39,19 +37,24 @@ public ScreenCaptureController(Context context,
         @Override
         public void onOrientationChanged(int orientation) {
             DisplayMetrics displayMetrics = DisplayUtils.getDisplayMetrics((Activity) context);
-            final int width = displayMetrics.widthPixels;
-            final int height = displayMetrics.heightPixels;
-
-            // Pivot to the executor thread because videoCapturer.changeCaptureFormat runs in the main
-            // thread and may deadlock.
-            ThreadUtils.runOnExecutor(() -> {
-                try {
-                    videoCapturer.changeCaptureFormat(width, height, DEFAULT_FPS);
-                } catch (Exception ex) {
-                    // We ignore exceptions here. The video capturer runs on its own
-                    // thread and we cannot synchronize with it.
-                }
-            });
+            final int width = (int) (displayMetrics.widthPixels * resolutionScale);
+            final int height = (int) (displayMetrics.heightPixels * resolutionScale);
+            if (width != ScreenCaptureController.this.targetWidth
+                    || height != ScreenCaptureController.this.targetHeight) {
+                ScreenCaptureController.this.targetWidth = width;
+                ScreenCaptureController.this.targetHeight = height;
+
+                // Pivot to the executor thread because videoCapturer.changeCaptureFormat runs in the main
+                // thread and may deadlock.
+                ThreadUtils.runOnExecutor(() -> {
+                    try {
+                        videoCapturer.changeCaptureFormat(width, height, DEFAULT_FPS);
+                    } catch (Exception ex) {
+                        // We ignore exceptions here. The video capturer runs on its own
+                        // thread and we cannot synchronize with it.
+                    }
+                });
+            }
         }
     };
diff --git a/android/src/main/java/com/oney/WebRTCModule/SerializeUtils.java b/android/src/main/java/com/oney/WebRTCModule/SerializeUtils.java
index bf1ec2872..dbd8942a6 100644
--- a/android/src/main/java/com/oney/WebRTCModule/SerializeUtils.java
+++ b/android/src/main/java/com/oney/WebRTCModule/SerializeUtils.java
@@ -150,6 +150,9 @@ public static ReadableMap serializeRtpParameters(RtpParameters params) {
             if (encoding.maxBitrateBps != null) {
                 encodingMap.putInt("maxBitrate", encoding.maxBitrateBps);
             }
+            if (encoding.minBitrateBps != null) {
+                encodingMap.putInt("minBitrate", encoding.minBitrateBps);
+            }
             if (encoding.maxFramerate != null) {
                 encodingMap.putInt("maxFramerate", encoding.maxFramerate);
             }
@@ -238,6 +241,7 @@ public static RtpParameters updateRtpParameters(ReadableMap updateParams, RtpPar
             RtpParameters.Encoding encoding = encodings.get(i);
             // Dealing with nullable Integers
             Integer maxBitrate = encodingUpdate.hasKey("maxBitrate") ? encodingUpdate.getInt("maxBitrate") : null;
+            Integer minBitrate = encodingUpdate.hasKey("minBitrate") ? encodingUpdate.getInt("minBitrate") : null;
             Integer maxFramerate = encodingUpdate.hasKey("maxFramerate") ? encodingUpdate.getInt("maxFramerate") : null;
             Double scaleResolutionDownBy = encodingUpdate.hasKey("scaleResolutionDownBy") ?
                     encodingUpdate.getDouble("scaleResolutionDownBy")
@@ -246,6 +250,7 @@ public static RtpParameters updateRtpParameters(ReadableMap updateParams, RtpPar
             encoding.active = encodingUpdate.getBoolean("active");
             encoding.rid = encodingUpdate.getString("rid");
             encoding.maxBitrateBps = maxBitrate;
+            encoding.minBitrateBps = minBitrate;
             encoding.maxFramerate = maxFramerate;
             encoding.scaleResolutionDownBy = scaleResolutionDownBy;
         }
@@ -294,6 +299,9 @@ private static RtpParameters.Encoding parseEncoding(ReadableMap params) {
         if (params.hasKey("maxBitrate")) {
             encoding.maxBitrateBps = params.getInt("maxBitrate");
         }
+        if (params.hasKey("minBitrate")) {
+            encoding.minBitrateBps = params.getInt("minBitrate");
+        }
         if (params.hasKey("maxFramerate")) {
             encoding.maxFramerate = params.getInt("maxFramerate");
         }
diff --git a/android/src/main/java/com/oney/WebRTCModule/SpeechActivityDetector.java b/android/src/main/java/com/oney/WebRTCModule/SpeechActivityDetector.java
index c621440dc..edefbbc7d 100644
--- a/android/src/main/java/com/oney/WebRTCModule/SpeechActivityDetector.java
+++ b/android/src/main/java/com/oney/WebRTCModule/SpeechActivityDetector.java
@@ -45,7 +45,6 @@
  * the listener is responsible for dispatching to the JS thread.
  */
 class SpeechActivityDetector {
-
     interface Listener {
         void onSpeechStarted();
         void onSpeechEnded();
diff --git a/android/src/main/java/com/oney/WebRTCModule/VideoTrackAdapter.java b/android/src/main/java/com/oney/WebRTCModule/VideoTrackAdapter.java
index 946dfee55..343f70ab8 100644
--- a/android/src/main/java/com/oney/WebRTCModule/VideoTrackAdapter.java
+++ b/android/src/main/java/com/oney/WebRTCModule/VideoTrackAdapter.java
@@ -210,7 +210,9 @@ private void emitDimensionChangeEvent(int width, int height) {
             params.putInt("width", width);
             params.putInt("height", height);
 
-            Log.d(TAG, "Dimension change event pcId: " + peerConnectionId + " trackId: " + trackId + " dimensions: " + width + "x" + height);
+            Log.d(TAG,
+                    "Dimension change event pcId: " + peerConnectionId + " trackId: " + trackId
+                            + " dimensions: " + width + "x" + height);
 
             VideoTrackAdapter.this.webRTCModule.sendEvent("videoTrackDimensionChanged", params);
         }
diff --git a/android/src/main/java/com/oney/WebRTCModule/WebRTCModule.java b/android/src/main/java/com/oney/WebRTCModule/WebRTCModule.java
index d2cf9ea40..a31643897 100644
--- a/android/src/main/java/com/oney/WebRTCModule/WebRTCModule.java
+++ b/android/src/main/java/com/oney/WebRTCModule/WebRTCModule.java
@@ -28,6 +28,11 @@ import org.webrtc.audio.AudioDeviceModule;
 import org.webrtc.audio.JavaAudioDeviceModule;
 
+import java.io.ByteArrayInputStream;
+import java.nio.charset.StandardCharsets;
+import java.security.MessageDigest;
+import java.security.cert.CertificateFactory;
+import java.security.cert.X509Certificate;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.HashMap;
@@ -50,6 +55,9 @@ public class WebRTCModule extends ReactContextBaseJavaModule {
     private final SparseArray<PeerConnectionObserver> mPeerConnectionObservers;
     final Map<String, MediaStream> localStreams;
 
+    // Store generated certificates by ID to avoid exposing private keys to JS
+    private static final Map<String, RtcCertificatePem> mCertificates = new HashMap<>();
+
     private final GetUserMediaImpl getUserMediaImpl;
 
     private SpeechActivityDetector speechActivityDetector;
@@ -69,7 +77,9 @@ public WebRTCModule(ReactApplicationContext reactContext) {
 
         String fieldTrials = options.fieldTrials;
 
-        PeerConnectionFactory.initialize(PeerConnectionFactory.InitializationOptions.builder(reactContext)
+        PeerConnectionFactory.initialize(PeerConnectionFactory.InitializationOptions
+                .builder(reactContext)
+                .setFieldTrials(fieldTrials)
                 .setNativeLibraryLoader(new LibraryLoader())
                 .setInjectableLogger(injectableLogger, loggingSeverity)
@@ -82,7 +92,8 @@ public WebRTCModule(ReactApplicationContext reactContext) {
         if (encoderFactory == null || decoderFactory == null) {
             // Initialize EGL context required for HW acceleration.
             EglBase.Context eglContext = EglUtils.getRootEglBaseContext();
-            encoderFactory = new SimulcastAlignedVideoEncoderFactory(eglContext, true, true, ResolutionAdjustment.MULTIPLE_OF_16);
+            encoderFactory = new SimulcastAlignedVideoEncoderFactory(
+                    eglContext, true, true, ResolutionAdjustment.MULTIPLE_OF_16);
             decoderFactory = new SelectiveVideoDecoderFactory(eglContext, false, Arrays.asList("VP9", "AV1"));
         }
@@ -94,6 +105,8 @@ public WebRTCModule(ReactApplicationContext reactContext) {
         try {
             if (options.audioProcessingFactoryProvider != null) {
                 audioProcessingFactory = options.audioProcessingFactoryProvider.getFactory();
+            } else if (options.audioProcessingFactoryFactory != null) {
+                audioProcessingFactory = options.audioProcessingFactoryFactory.call();
             }
         } catch (Exception e) {
             // do nothing.
@@ -129,41 +142,43 @@ public void invalidate() {
         Log.d(TAG, "invalidate()");
 
         try {
-            ThreadUtils.submitToExecutor(() -> {
-                // 1. Dispose PeerConnections (dispose() calls close() internally)
-                for (int i = 0; i < mPeerConnectionObservers.size(); i++) {
-                    try {
-                        mPeerConnectionObservers.valueAt(i).dispose();
-                    } catch (Exception e) {
-                        Log.w(TAG, "invalidate(): error disposing PC " + mPeerConnectionObservers.keyAt(i), e);
-                    }
-                }
-                mPeerConnectionObservers.clear();
-
-                // 2. Detach tracks, then dispose streams. Tracks themselves get disposed in step 3.
-                for (Map.Entry<String, MediaStream> entry : localStreams.entrySet()) {
-                    try {
-                        MediaStream stream = entry.getValue();
-                        for (AudioTrack t : new ArrayList<>(stream.audioTracks)) stream.removeTrack(t);
-                        for (VideoTrack t : new ArrayList<>(stream.videoTracks)) stream.removeTrack(t);
-                        stream.dispose();
-                    } catch (Exception e) {
-                        Log.w(TAG, "invalidate(): error disposing stream " + entry.getKey(), e);
-                    }
-                }
-                localStreams.clear();
+            ThreadUtils
+                    .submitToExecutor(() -> {
+                        // 1. Dispose PeerConnections (dispose() calls close() internally)
+                        for (int i = 0; i < mPeerConnectionObservers.size(); i++) {
+                            try {
+                                mPeerConnectionObservers.valueAt(i).dispose();
+                            } catch (Exception e) {
+                                Log.w(TAG, "invalidate(): error disposing PC " + mPeerConnectionObservers.keyAt(i), e);
+                            }
+                        }
+                        mPeerConnectionObservers.clear();
+
+                        // 2. Detach tracks, then dispose streams. Tracks themselves get disposed in step 3.
+                        for (Map.Entry<String, MediaStream> entry : localStreams.entrySet()) {
+                            try {
+                                MediaStream stream = entry.getValue();
+                                for (AudioTrack t : new ArrayList<>(stream.audioTracks)) stream.removeTrack(t);
+                                for (VideoTrack t : new ArrayList<>(stream.videoTracks)) stream.removeTrack(t);
+                                stream.dispose();
+                            } catch (Exception e) {
+                                Log.w(TAG, "invalidate(): error disposing stream " + entry.getKey(), e);
+                            }
+                        }
+                        localStreams.clear();
 
-                // 3. Stop capturers + dispose tracks (prevents use-after-free on factory threads)
-                getUserMediaImpl.disposeAllTracks();
+                        // 3. Stop capturers + dispose tracks (prevents use-after-free on factory threads)
+                        getUserMediaImpl.disposeAllTracks();
 
-                // 4. Dispose factory (frees C++ factory + 3 threads)
-                if (mFactory != null) {
-                    mFactory.dispose();
-                    mFactory = null;
-                }
+                        // 4. Dispose factory (frees C++ factory + 3 threads)
+                        if (mFactory != null) {
+                            mFactory.dispose();
+                            mFactory = null;
+                        }
 
-                return null;
-            }).get();
+                        return null;
+                    })
+                    .get();
         } catch (InterruptedException | ExecutionException e) {
             Log.e(TAG, "invalidate() error", e);
         }
@@ -188,28 +203,28 @@ public void onSpeechEnded() {
             }
         });
 
-        return JavaAudioDeviceModule
-                .builder(reactContext)
+        return JavaAudioDeviceModule.builder(reactContext)
                 .setUseHardwareAcousticEchoCanceler(Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q)
                 .setUseHardwareNoiseSuppressor(Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q)
                 .setUseStereoOutput(true)
-                .setAudioBufferCallback((audioBuffer, audioFormat, channelCount, sampleRate, bytesRead, captureTimeNs) -> {
-                    // 1. Speech activity detection on raw mic data, BEFORE any mutation.
-                    speechActivityDetector.processBuffer(audioBuffer, bytesRead);
-
-                    // 2. Existing screen-audio mixing — mutates audioBuffer in place.
-                    if (bytesRead > 0) {
-                        WebRTCModuleOptions.ScreenAudioBytesProvider provider =
-                                WebRTCModuleOptions.getInstance().screenAudioBytesProvider;
-                        if (provider != null) {
-                            java.nio.ByteBuffer screenBuffer = provider.getScreenAudioBytes(bytesRead);
-                            if (screenBuffer != null && screenBuffer.remaining() > 0) {
-                                mixScreenAudioIntoBuffer(audioBuffer, screenBuffer, bytesRead);
+                .setAudioBufferCallback(
+                        (audioBuffer, audioFormat, channelCount, sampleRate, bytesRead, captureTimeNs) -> {
+                            // 1. Speech activity detection on raw mic data, BEFORE any mutation.
+                            speechActivityDetector.processBuffer(audioBuffer, bytesRead);
+
+                            // 2. Existing screen-audio mixing — mutates audioBuffer in place.
+                            if (bytesRead > 0) {
+                                WebRTCModuleOptions.ScreenAudioBytesProvider provider =
+                                        WebRTCModuleOptions.getInstance().screenAudioBytesProvider;
+                                if (provider != null) {
+                                    java.nio.ByteBuffer screenBuffer = provider.getScreenAudioBytes(bytesRead);
+                                    if (screenBuffer != null && screenBuffer.remaining() > 0) {
+                                        mixScreenAudioIntoBuffer(audioBuffer, screenBuffer, bytesRead);
+                                    }
+                                }
                             }
-                        }
-                    }
-                    return captureTimeNs;
-                })
+                            return captureTimeNs;
+                        })
                 .setAudioRecordStateCallback(new JavaAudioDeviceModule.AudioRecordStateCallback() {
                     @Override
                     public void onWebRtcAudioRecordStart() {
@@ -230,9 +245,8 @@ public void onWebRtcAudioRecordStop() {
      * within its own bounds. When one buffer is shorter, the other's samples pass
     * through unmodified (mic samples stay as-is, or screen-only samples are written).
      */
-    private static void mixScreenAudioIntoBuffer(java.nio.ByteBuffer micBuffer,
-            java.nio.ByteBuffer screenBuffer,
-            int bytesRead) {
+    private static void mixScreenAudioIntoBuffer(
+            java.nio.ByteBuffer micBuffer, java.nio.ByteBuffer screenBuffer, int bytesRead) {
         micBuffer.position(0);
         screenBuffer.position(0);
 
@@ -419,7 +433,24 @@ private PeerConnection.RTCConfiguration parseRTCConfiguration(ReadableMap map) {
         }
 
         // FIXME: peerIdentity of type DOMString (public api)
-        // FIXME: certificates of type sequence<RTCCertificate> (public api)
+
+        // certificates (public api)
+        if (map.hasKey("certificates") && map.getType("certificates") == ReadableType.Array) {
+            ReadableArray certificates = map.getArray("certificates");
+            if (certificates.size() > 0) {
+                ReadableMap certMap = certificates.getMap(0);
+                if (certMap.hasKey("certificateId")) {
+                    String certId = certMap.getString("certificateId");
+                    RtcCertificatePem cert;
+                    synchronized (mCertificates) {
+                        cert = mCertificates.get(certId);
+                    }
+                    if (cert != null) {
+                        conf.certificate = cert;
+                    }
+                }
+            }
+        }
 
         // iceCandidatePoolSize of type unsigned short, defaulting to 0
         if (map.hasKey("iceCandidatePoolSize") && map.getType("iceCandidatePoolSize") == ReadableType.Number) {
@@ -560,33 +591,23 @@ public boolean peerConnectionInit(ReadableMap configuration, int id) {
         }
     }
 
+    // Must be called in the executor.
     public MediaStream getStreamForReactTag(String streamReactTag) {
-        // This function _only_ gets called from WebRTCView, in the UI thread.
-        // Hence make sure we run this code in the executor or we run at the risk
-        // of being out of sync.
-        try {
-            return (MediaStream) ThreadUtils
-                    .submitToExecutor((Callable<MediaStream>) () -> {
-                        MediaStream stream = localStreams.get(streamReactTag);
-
-                        if (stream != null) {
-                            return stream;
-                        }
+        MediaStream stream = localStreams.get(streamReactTag);
 
-                        for (int i = 0, size = mPeerConnectionObservers.size(); i < size; i++) {
-                            PeerConnectionObserver pco = mPeerConnectionObservers.valueAt(i);
-                            stream = pco.remoteStreams.get(streamReactTag);
-                            if (stream != null) {
-                                return stream;
-                            }
-                        }
+        if (stream != null) {
+            return stream;
+        }
 
-                        return null;
-                    })
-                    .get();
-        } catch (ExecutionException | InterruptedException e) {
-            return null;
+        for (int i = 0, size = mPeerConnectionObservers.size(); i < size; i++) {
+            PeerConnectionObserver pco = mPeerConnectionObservers.valueAt(i);
+            stream = pco.remoteStreams.get(streamReactTag);
+            if (stream != null) {
+                return stream;
+            }
         }
+
+        return null;
     }
 
     public MediaStreamTrack getTrack(int pcId, String trackId) {
@@ -611,6 +632,15 @@ public VideoTrack createVideoTrack(AbstractVideoCaptureController videoCaptureCo
         return getUserMediaImpl.createVideoTrack(videoCaptureController);
     }
 
+    public void registerTrack(AudioTrack track, AudioSource source) {
+        getUserMediaImpl.registerTrack(track, source);
+    }
+
+    public void registerTrack(VideoTrack track, VideoSource source, AbstractVideoCaptureController controller,
+            SurfaceTextureHelper surfaceTextureHelper) {
+        getUserMediaImpl.registerTrack(track, source, controller, surfaceTextureHelper);
+    }
+
     public void createStream(
             MediaStreamTrack[] tracks, GetUserMediaImpl.BiConsumer<String, ArrayList<WritableMap>> successCallback) {
         getUserMediaImpl.createStream(tracks, successCallback);
@@ -660,8 +690,9 @@ public WritableMap peerConnectionAddTransceiver(int id, ReadableMap options) {
                 MediaStreamTrack track = getLocalTrack(trackId);
                 transceiver = pco.addTransceiver(
                         track, SerializeUtils.parseTransceiverOptions(options.getMap("init")));
-
-                // Add mute detection for local video tracks (dimension detection is handled at track creation)
+
+                // Add mute detection for local video tracks (dimension detection is handled at track
+                // creation)
                 if (track instanceof VideoTrack) {
                     pco.videoTrackAdapters.addAdapter((VideoTrack) track);
                 }
@@ -717,7 +748,7 @@ public WritableMap peerConnectionAddTrack(int id, String trackId, ReadableMap op
             }
         }
         RtpSender sender = pco.getPeerConnection().addTrack(track, streamIds);
-
+
         // Add mute detection for local video tracks (dimension detection is handled at track creation)
         if (track instanceof VideoTrack) {
            pco.videoTrackAdapters.addAdapter((VideoTrack) track);
@@ -951,8 +982,8 @@ public boolean transceiverSetCodecPreferences(int id, String senderId, ReadableA
     }
 
     @ReactMethod
-    public void getDisplayMedia(Promise promise) {
-        ThreadUtils.runOnExecutor(() -> getUserMediaImpl.getDisplayMedia(promise));
+    public void getDisplayMedia(ReadableMap constraints, Promise promise) {
+        ThreadUtils.runOnExecutor(() -> getUserMediaImpl.getDisplayMedia(constraints, promise));
     }
 
     @ReactMethod
@@ -1142,12 +1173,9 @@ private ReadableArray getTransceiversInfo(PeerConnection peerConnection) {
         return transceiverUpdates;
     }
 
-
     @ReactMethod
     public void mediaStreamTrackSetVideoEffects(String id, ReadableArray names) {
-        ThreadUtils.runOnExecutor(() -> {
-            getUserMediaImpl.setVideoEffects(id, names);
-        });
+        ThreadUtils.runOnExecutor(() -> { getUserMediaImpl.setVideoEffects(id, names); });
    }
 
     @ReactMethod
@@ -1483,10 +1511,13 @@ public void peerConnectionAddICECandidate(int pcId, ReadableMap candidateMap, Pr
             return;
         }
 
-        IceCandidate candidate = new
IceCandidate( - candidateMap.hasKey("sdpMid") && !candidateMap.isNull("sdpMid") ? candidateMap.getString("sdpMid") : "", - candidateMap.hasKey("sdpMLineIndex") && !candidateMap.isNull("sdpMLineIndex") ? candidateMap.getInt("sdpMLineIndex") : 0, - candidateMap.getString("candidate")); + IceCandidate candidate = new IceCandidate(candidateMap.hasKey("sdpMid") && !candidateMap.isNull("sdpMid") + ? candidateMap.getString("sdpMid") + : "", + candidateMap.hasKey("sdpMLineIndex") && !candidateMap.isNull("sdpMLineIndex") + ? candidateMap.getInt("sdpMLineIndex") + : 0, + candidateMap.getString("candidate")); peerConnection.addIceCandidate(candidate, new AddIceObserver() { @Override @@ -1620,6 +1651,177 @@ public void dataChannelSend(int peerConnectionId, String reactTag, String data, }); } + // Frame Cryptor methods + //////////////////////////////// + RTCCryptoManager frameCryptor = new RTCCryptoManager(this); + + @ReactMethod(isBlockingSynchronousMethod = true) + public String frameCryptorFactoryCreateFrameCryptor(ReadableMap config) { + return frameCryptor.frameCryptorFactoryCreateFrameCryptor(config); + } + + @ReactMethod + public void frameCryptorSetKeyIndex(ReadableMap config, Promise promise) { + frameCryptor.frameCryptorSetKeyIndex(config, promise); + } + + @ReactMethod + public void frameCryptorGetKeyIndex(ReadableMap config, Promise promise) { + frameCryptor.frameCryptorGetKeyIndex(config, promise); + } + + @ReactMethod + public void frameCryptorSetEnabled(ReadableMap config, Promise promise) { + frameCryptor.frameCryptorSetEnabled(config, promise); + } + + @ReactMethod + public void frameCryptorGetEnabled(ReadableMap config, Promise promise) { + frameCryptor.frameCryptorGetEnabled(config, promise); + } + + @ReactMethod + public void frameCryptorDispose(ReadableMap config, Promise promise) { + frameCryptor.frameCryptorDispose(config, promise); + } + + @ReactMethod(isBlockingSynchronousMethod = true) + public String frameCryptorFactoryCreateKeyProvider(ReadableMap 
config) { + return frameCryptor.frameCryptorFactoryCreateKeyProvider(config); + } + + @ReactMethod + public void keyProviderSetSharedKey(ReadableMap config, Promise promise) { + frameCryptor.keyProviderSetSharedKey(config, promise); + } + + @ReactMethod + public void keyProviderRatchetSharedKey(ReadableMap config, Promise promise) { + frameCryptor.keyProviderRatchetSharedKey(config, promise); + } + + @ReactMethod + public void keyProviderExportSharedKey(ReadableMap config, Promise promise) { + frameCryptor.keyProviderExportSharedKey(config, promise); + } + + @ReactMethod + public void keyProviderSetKey(ReadableMap config, Promise promise) { + frameCryptor.keyProviderSetKey(config, promise); + } + + @ReactMethod + public void keyProviderRatchetKey(ReadableMap config, Promise promise) { + frameCryptor.keyProviderRatchetKey(config, promise); + } + + @ReactMethod + public void keyProviderExportKey(ReadableMap config, Promise promise) { + frameCryptor.keyProviderExportKey(config, promise); + } + + @ReactMethod + public void keyProviderSetSifTrailer(ReadableMap config, Promise promise) { + frameCryptor.keyProviderSetSifTrailer(config, promise); + } + + @ReactMethod + public void keyProviderDispose(ReadableMap config, Promise promise) { + frameCryptor.keyProviderDispose(config, promise); + } + + @ReactMethod + public void generateCertificate(ReadableMap options, Promise promise) { + ThreadUtils.runOnExecutor(() -> { + try { + PeerConnection.KeyType keyType = PeerConnection.KeyType.ECDSA; + long expires = 2592000L; // Default 30 days + + if (options.hasKey("keyType")) { + String keyTypeStr = options.getString("keyType"); + if ("RSA".equals(keyTypeStr)) { + keyType = PeerConnection.KeyType.RSA; + } else if ("ECDSA".equals(keyTypeStr)) { + keyType = PeerConnection.KeyType.ECDSA; + } + } + + if (options.hasKey("expires")) { + expires = (long) options.getDouble("expires"); + } + + RtcCertificatePem cert = RtcCertificatePem.generateCertificate(keyType, expires); + String certId 
= java.util.UUID.randomUUID().toString(); + synchronized (mCertificates) { + mCertificates.put(certId, cert); + } + + WritableMap params = Arguments.createMap(); + params.putString("certificateId", certId); + // Return expires as millis since epoch + params.putDouble("expires", System.currentTimeMillis() + expires * 1000); + + // Calculate fingerprints + WritableArray fingerprints = Arguments.createArray(); + + try { + CertificateFactory cf = CertificateFactory.getInstance("X.509"); + ByteArrayInputStream is = + new ByteArrayInputStream(cert.certificate.getBytes(StandardCharsets.UTF_8)); + X509Certificate x509Cert = (X509Certificate) cf.generateCertificate(is); + + MessageDigest digest = MessageDigest.getInstance("SHA-256"); + byte[] hash = digest.digest(x509Cert.getEncoded()); + + WritableMap fingerprint = Arguments.createMap(); + fingerprint.putString("algorithm", "sha-256"); + fingerprint.putString("value", bytesToHex(hash)); + fingerprints.pushMap(fingerprint); + } catch (Exception e) { + Log.e(TAG, "Failed to calculate fingerprint: " + e.getMessage()); + } + + params.putArray("fingerprints", fingerprints); + + promise.resolve(params); + } catch (Exception e) { + promise.reject(e); + } + }); + } + + private String bytesToHex(byte[] bytes) { + StringBuilder sb = new StringBuilder(); + for (byte b : bytes) { + sb.append(String.format("%02x", b)); + sb.append(":"); + } + if (sb.length() > 0) { + sb.setLength(sb.length() - 1); + } + return sb.toString(); + } + + @ReactMethod + public void dataPacketCryptorFactoryCreateDataPacketCryptor(ReadableMap params, @NonNull Promise result) { + frameCryptor.dataPacketCryptorFactoryCreateDataPacketCryptor(params, result); + } + + @ReactMethod + public void dataPacketCryptorEncrypt(ReadableMap params, @NonNull Promise result) { + frameCryptor.dataPacketCryptorEncrypt(params, result); + } + + @ReactMethod + public void dataPacketCryptorDecrypt(ReadableMap params, @NonNull Promise result) { + 
frameCryptor.dataPacketCryptorDecrypt(params, result); + } + + @ReactMethod + public void dataPacketCryptorDispose(ReadableMap params, @NonNull Promise result) { + frameCryptor.dataPacketCryptorDispose(params, result); + } + @ReactMethod public void addListener(String eventName) { // Keep: Required for RN built in Event Emitter Calls. diff --git a/android/src/main/java/com/oney/WebRTCModule/WebRTCModuleOptions.java b/android/src/main/java/com/oney/WebRTCModule/WebRTCModuleOptions.java index 24e53f8ce..0795330a8 100644 --- a/android/src/main/java/com/oney/WebRTCModule/WebRTCModuleOptions.java +++ b/android/src/main/java/com/oney/WebRTCModule/WebRTCModuleOptions.java @@ -2,6 +2,7 @@ import com.oney.WebRTCModule.audio.AudioProcessingFactoryProvider; +import org.webrtc.AudioProcessingFactory; import org.webrtc.Loggable; import org.webrtc.Logging; import org.webrtc.VideoDecoderFactory; @@ -9,6 +10,7 @@ import org.webrtc.audio.AudioDeviceModule; import java.nio.ByteBuffer; +import java.util.concurrent.Callable; public class WebRTCModuleOptions { private static WebRTCModuleOptions instance; @@ -16,11 +18,14 @@ public class WebRTCModuleOptions { public VideoEncoderFactory videoEncoderFactory; public VideoDecoderFactory videoDecoderFactory; public AudioDeviceModule audioDeviceModule; + public Callable audioProcessingFactoryFactory; + public Loggable injectableLogger; public Logging.Severity loggingSeverity; public String fieldTrials; public boolean enableMediaProjectionService; public AudioProcessingFactoryProvider audioProcessingFactoryProvider; + public double defaultTrackVolume = 1.0; /** * Provider for screen share audio bytes. 
When set, the AudioDeviceModule's diff --git a/android/src/main/java/com/oney/WebRTCModule/WebRTCView.java b/android/src/main/java/com/oney/WebRTCModule/WebRTCView.java index 95ede811e..91b12732f 100644 --- a/android/src/main/java/com/oney/WebRTCModule/WebRTCView.java +++ b/android/src/main/java/com/oney/WebRTCModule/WebRTCView.java @@ -163,28 +163,53 @@ private void cleanSurfaceViewRenderer() { surfaceViewRenderer.clearImage(); } - private VideoTrack getVideoTrackForStreamURL(String streamURL) { - VideoTrack videoTrack = null; + /** + * Asynchronously retrieves the VideoTrack for the given streamURL. + * This method avoids blocking the UI thread by performing the lookup + * on the WebRTC executor thread and posting the result back to the UI thread. + * + * @param streamURL The stream URL to lookup + * @param callback Callback invoked on UI thread with the VideoTrack (or null if not found) + */ + private void getVideoTrackForStreamURL(String streamURL, java.util.function.Consumer callback) { + if (streamURL == null) { + callback.accept(null); + return; + } - if (streamURL != null) { - ReactContext reactContext = (ReactContext) getContext(); - WebRTCModule module = reactContext.getNativeModule(WebRTCModule.class); - MediaStream stream = module.getStreamForReactTag(streamURL); + ReactContext reactContext = (ReactContext) getContext(); + WebRTCModule module = reactContext.getNativeModule(WebRTCModule.class); - if (stream != null) { - List videoTracks = stream.videoTracks; + // Submit lookup to executor thread to avoid blocking UI thread + ThreadUtils.runOnExecutor(() -> { + try { + MediaStream stream = module.getStreamForReactTag(streamURL); + if (stream == null) { + Log.w(TAG, "Stream not found for URL: " + streamURL); + post(() -> callback.accept(null)); + return; + } + VideoTrack videoTrack = null; + List videoTracks = stream.videoTracks; if (!videoTracks.isEmpty()) { videoTrack = videoTracks.get(0); } - } - if (videoTrack == null) { - Log.w(TAG, "No video stream 
for react tag: " + streamURL); - } - } + if (videoTrack == null) { + Log.w(TAG, "No video stream for react tag: " + streamURL); + post(() -> callback.accept(null)); + return; + } - return videoTrack; + // Post result back to UI thread + final VideoTrack result = videoTrack; + post(() -> callback.accept(result)); + } catch (Throwable tr) { + Log.e(TAG, "Error getting video track for stream URL: " + streamURL, tr); + post(() -> callback.accept(null)); + } + }); } @Override @@ -424,20 +449,23 @@ private void setScalingType(ScalingType scalingType) { * this {@code WebRTCView} or {@code null}. */ void setStreamURL(String streamURL) { - // Is the value of this.streamURL really changing? - if (!Objects.equals(streamURL, this.streamURL)) { - // XXX The value of this.streamURL is really changing. Before - // realizing/applying the change, let go of the old videoTrack. Of - // course, that is only necessary if the value of videoTrack will - // really change. Please note though that letting go of the old - // videoTrack before assigning to this.streamURL is vital; - // otherwise, removeRendererFromVideoTrack will fail to remove the - // old videoTrack from the associated videoRenderer, two - // VideoTracks (the old and the new) may start rendering and, most - // importantly the videoRender may eventually crash when the old - // videoTrack is disposed. - VideoTrack videoTrack = getVideoTrackForStreamURL(streamURL); + Log.d(TAG, "Set stream URL " + streamURL + " current: " + this.streamURL); + if (Objects.equals(streamURL, this.streamURL)) { + return; + } + // The value of this.streamURL is really changing. Before + // realizing/applying the change, let go of the old videoTrack. Of + // course, that is only necessary if the value of videoTrack will + // really change. 
Please note though that letting go of the old + // videoTrack before assigning to this.streamURL is vital; + // otherwise, removeRendererFromVideoTrack will fail to remove the + // old videoTrack from the associated videoRenderer, two + // VideoTracks (the old and the new) may start rendering and, most + // importantly the videoRender may eventually crash when the old + // videoTrack is disposed. + getVideoTrackForStreamURL(streamURL, videoTrack -> { + Log.d(TAG, "Got video track for stream URL " + streamURL + " -> " + videoTrack); if (this.videoTrack != videoTrack) { setVideoTrack(null); } @@ -447,7 +475,7 @@ void setStreamURL(String streamURL) { // After realizing/applying the change in the value of // this.streamURL, reflect it on the value of videoTrack. setVideoTrack(videoTrack); - } + }); } /** diff --git a/android/src/main/java/com/oney/WebRTCModule/audio/AudioProcessingController.java b/android/src/main/java/com/oney/WebRTCModule/audio/AudioProcessingController.java index 9444eb781..26b1abd62 100644 --- a/android/src/main/java/com/oney/WebRTCModule/audio/AudioProcessingController.java +++ b/android/src/main/java/com/oney/WebRTCModule/audio/AudioProcessingController.java @@ -5,12 +5,13 @@ public class AudioProcessingController implements AudioProcessingFactoryProvider { /** - * This is the audio processing module that will be applied to the audio stream after it is captured from the microphone. - * This is useful for adding echo cancellation, noise suppression, etc. + * This is the audio processing module that will be applied to the audio stream after it is captured from the + * microphone. This is useful for adding echo cancellation, noise suppression, etc. */ public final AudioProcessingAdapter capturePostProcessing = new AudioProcessingAdapter(); /** - * This is the audio processing module that will be applied to the audio stream before it is rendered to the speaker. 
+ * This is the audio processing module that will be applied to the audio stream before it is rendered to the + * speaker. */ public final AudioProcessingAdapter renderPreProcessing = new AudioProcessingAdapter(); diff --git a/android/src/main/java/com/oney/WebRTCModule/webrtcutils/WrappedVideoDecoderFactory.java b/android/src/main/java/com/oney/WebRTCModule/webrtcutils/WrappedVideoDecoderFactory.java index c94910d5a..22f2503d7 100644 --- a/android/src/main/java/com/oney/WebRTCModule/webrtcutils/WrappedVideoDecoderFactory.java +++ b/android/src/main/java/com/oney/WebRTCModule/webrtcutils/WrappedVideoDecoderFactory.java @@ -19,23 +19,25 @@ import androidx.annotation.Nullable; import org.webrtc.*; + import java.util.Arrays; import java.util.LinkedHashSet; /** - * A patch on top of https://github.com/GetStream/webrtc/blob/main/sdk/android/api/org/webrtc/WrappedVideoDecoderFactory.java - * It disables direct-to-SurfaceTextureFrame rendering for c2 exynos/qualcomm/mediatek hardware decoder + * A patch on top of + * https://github.com/GetStream/webrtc/blob/main/sdk/android/api/org/webrtc/WrappedVideoDecoderFactory.java It disables + * direct-to-SurfaceTextureFrame rendering for c2 exynos/qualcomm/mediatek hardware decoder */ public class WrappedVideoDecoderFactory implements VideoDecoderFactory { // Known hardware decoders to have failures when it outputs to a SurfaceTexture directly private static final String[] DECODER_DENYLIST_PREFIXES = { - "OMX.qcom.", - "OMX.hisi.", + "OMX.qcom.", "OMX.hisi.", // https://github.com/androidx/media/issues/2003 -// "c2.exynos.", -// "c2.qti.", -// // https://github.com/androidx/media/blob/bfe5930f7f29c6492d60e3d01a90abd3c138b615/libraries/exoplayer/src/main/java/androidx/media3/exoplayer/video/MediaCodecVideoRenderer.java#L1499 -// "c2.mtk.", + // "c2.exynos.", + // "c2.qti.", + // // + // 
https://github.com/androidx/media/blob/bfe5930f7f29c6492d60e3d01a90abd3c138b615/libraries/exoplayer/src/main/java/androidx/media3/exoplayer/video/MediaCodecVideoRenderer.java#L1499 + // "c2.mtk.", }; private final boolean forceSWCodec; @@ -47,7 +49,8 @@ public WrappedVideoDecoderFactory(@Nullable EglBase.Context eglContext, boolean } private final VideoDecoderFactory hardwareVideoDecoderFactory; - private final VideoDecoderFactory hardwareVideoDecoderFactoryWithoutEglContext = new HardwareVideoDecoderFactory(null) ; + private final VideoDecoderFactory hardwareVideoDecoderFactoryWithoutEglContext = + new HardwareVideoDecoderFactory(null); private final VideoDecoderFactory softwareVideoDecoderFactory = new SoftwareVideoDecoderFactory(); @Nullable private final VideoDecoderFactory platformSoftwareVideoDecoderFactory; @@ -62,7 +65,7 @@ public VideoDecoder createDecoder(VideoCodecInfo codecType) { if (softwareDecoder == null && this.platformSoftwareVideoDecoderFactory != null) { softwareDecoder = this.platformSoftwareVideoDecoderFactory.createDecoder(codecType); } - if(hardwareDecoder != null && disableSurfaceTextureFrame(hardwareDecoder.getImplementationName())) { + if (hardwareDecoder != null && disableSurfaceTextureFrame(hardwareDecoder.getImplementationName())) { hardwareDecoder.release(); hardwareDecoder = this.hardwareVideoDecoderFactoryWithoutEglContext.createDecoder(codecType); } diff --git a/android/src/main/java/org/webrtc/Camera1Helper.java b/android/src/main/java/org/webrtc/Camera1Helper.java index f0dec0d8d..3a43ac540 100644 --- a/android/src/main/java/org/webrtc/Camera1Helper.java +++ b/android/src/main/java/org/webrtc/Camera1Helper.java @@ -30,7 +30,6 @@ */ public class Camera1Helper { - public static int getCameraId(String deviceName) { return Camera1Enumerator.getCameraIndex(deviceName); } diff --git a/android/src/main/java/org/webrtc/Camera2Helper.java b/android/src/main/java/org/webrtc/Camera2Helper.java index eab20edb2..e9189a47f 100644 --- 
a/android/src/main/java/org/webrtc/Camera2Helper.java +++ b/android/src/main/java/org/webrtc/Camera2Helper.java @@ -30,13 +30,14 @@ * by [CameraManager.getCameraIdList]. */ public class Camera2Helper { - @Nullable - public static List getSupportedFormats(CameraManager cameraManager, @Nullable String cameraId) { + public static List getSupportedFormats( + CameraManager cameraManager, @Nullable String cameraId) { return Camera2Enumerator.getSupportedFormats(cameraManager, cameraId); } - public static Size findClosestCaptureFormat(CameraManager cameraManager, @Nullable String cameraId, int width, int height) { + public static Size findClosestCaptureFormat( + CameraManager cameraManager, @Nullable String cameraId, int width, int height) { List formats = getSupportedFormats(cameraManager, cameraId); List sizes = new ArrayList<>(); diff --git a/examples/GumTestApp/ios/GumTestApp.xcodeproj/project.pbxproj b/examples/GumTestApp/ios/GumTestApp.xcodeproj/project.pbxproj index e5ba6d775..1e9bc8bcc 100644 --- a/examples/GumTestApp/ios/GumTestApp.xcodeproj/project.pbxproj +++ b/examples/GumTestApp/ios/GumTestApp.xcodeproj/project.pbxproj @@ -759,9 +759,7 @@ ); MTL_ENABLE_DEBUG_INFO = YES; ONLY_ACTIVE_ARCH = YES; - OTHER_LDFLAGS = ( - "$(inherited)", - ); + OTHER_LDFLAGS = "$(inherited)"; REACT_NATIVE_PATH = "${PODS_ROOT}/../../node_modules/react-native"; SDKROOT = iphoneos; SWIFT_ACTIVE_COMPILATION_CONDITIONS = "$(inherited) DEBUG"; @@ -818,9 +816,7 @@ "\"$(inherited)\"", ); MTL_ENABLE_DEBUG_INFO = NO; - OTHER_LDFLAGS = ( - "$(inherited)", - ); + OTHER_LDFLAGS = "$(inherited)"; REACT_NATIVE_PATH = "${PODS_ROOT}/../../node_modules/react-native"; SDKROOT = iphoneos; USE_HERMES = true; diff --git a/ios/RCTWebRTC/AudioDeviceModuleObserver.m b/ios/RCTWebRTC/AudioDeviceModuleObserver.m index 53318726d..08bc9fa52 100644 --- a/ios/RCTWebRTC/AudioDeviceModuleObserver.m +++ b/ios/RCTWebRTC/AudioDeviceModuleObserver.m @@ -43,7 +43,7 @@ - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule 
*)audioDeviceModule didCrea [self.module sendEventWithName:kEventAudioDeviceModuleEngineCreated body:@{}]; } - return 0; // Success + return 0; // Success } - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule @@ -62,7 +62,7 @@ - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule }]; } - return 0; // Success + return 0; // Success } - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule @@ -81,7 +81,7 @@ - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule }]; } - return 0; // Success + return 0; // Success } - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule @@ -100,7 +100,7 @@ - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule }]; } - return 0; // Success + return 0; // Success } - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule @@ -119,7 +119,7 @@ - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule }]; } - return 0; // Success + return 0; // Success } - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule willReleaseEngine:(AVAudioEngine *)engine { @@ -129,7 +129,7 @@ - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule willRel [self.module sendEventWithName:kEventAudioDeviceModuleEngineWillRelease body:@{}]; } - return 0; // Success + return 0; // Success } - (NSInteger)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule @@ -172,7 +172,8 @@ - (void)audioDeviceModule:(RTCAudioDeviceModule *)audioDeviceModule }]; } - RCTLog(@"[AudioDeviceModuleObserver] Audio processing state updated - VP enabled: %d, VP bypassed: %d, AGC enabled: %d, stereo: %d", + RCTLog(@"[AudioDeviceModuleObserver] Audio processing state updated - VP enabled: %d, VP bypassed: %d, AGC " + @"enabled: %d, stereo: %d", state.voiceProcessingEnabled, state.voiceProcessingBypassed, state.voiceProcessingAGCEnabled, diff --git a/ios/RCTWebRTC/CaptureController.h 
b/ios/RCTWebRTC/CaptureController.h index ff16496f5..fc7a097bd 100644 --- a/ios/RCTWebRTC/CaptureController.h +++ b/ios/RCTWebRTC/CaptureController.h @@ -6,7 +6,7 @@ NS_ASSUME_NONNULL_BEGIN @interface CaptureController : NSObject @property(nonatomic, strong) id eventsDelegate; -@property(nonatomic, copy, nullable) NSString* deviceId; +@property(nonatomic, copy, nullable) NSString *deviceId; - (void)startCapture; - (void)stopCapture; diff --git a/ios/RCTWebRTC/CaptureController.m b/ios/RCTWebRTC/CaptureController.m index dc94d7da5..86a964ae8 100644 --- a/ios/RCTWebRTC/CaptureController.m +++ b/ios/RCTWebRTC/CaptureController.m @@ -12,17 +12,16 @@ - (void)stopCapture { // subclasses needs to override } -- (NSDictionary *) getSettings { +- (NSDictionary *)getSettings { // subclasses needs to override - return @{ - @"deviceId": self.deviceId - }; + return @{@"deviceId" : self.deviceId}; } - (void)applyConstraints:(NSDictionary *)constraints error:(NSError **)outError { - *outError = [NSError errorWithDomain:@"react-native-webrtc" - code:0 - userInfo:@{ NSLocalizedDescriptionKey: @"This video track does not support applyConstraints."}]; + *outError = + [NSError errorWithDomain:@"react-native-webrtc" + code:0 + userInfo:@{NSLocalizedDescriptionKey : @"This video track does not support applyConstraints."}]; } @end diff --git a/ios/RCTWebRTC/InAppScreenCaptureController.m b/ios/RCTWebRTC/InAppScreenCaptureController.m index 1b9561d39..525bd9768 100644 --- a/ios/RCTWebRTC/InAppScreenCaptureController.m +++ b/ios/RCTWebRTC/InAppScreenCaptureController.m @@ -3,7 +3,7 @@ #import "InAppScreenCaptureController.h" #import "InAppScreenCapturer.h" -@interface InAppScreenCaptureController () +@interface InAppScreenCaptureController () @end @implementation InAppScreenCaptureController @@ -31,11 +31,7 @@ - (void)stopCapture { } - (NSDictionary *)getSettings { - return @{ - @"deviceId": self.deviceId ?: @"in-app-screen-capture", - @"groupId": @"", - @"frameRate": @(30) - }; + return 
@{@"deviceId" : self.deviceId ?: @"in-app-screen-capture", @"groupId" : @"", @"frameRate" : @(30)}; } #pragma mark - CapturerEventsDelegate diff --git a/ios/RCTWebRTC/InAppScreenCapturer.m b/ios/RCTWebRTC/InAppScreenCapturer.m index 893761ae9..ad81964f3 100644 --- a/ios/RCTWebRTC/InAppScreenCapturer.m +++ b/ios/RCTWebRTC/InAppScreenCapturer.m @@ -28,46 +28,48 @@ - (void)startCapture { - (void)startRPScreenRecorder { RPScreenRecorder *recorder = [RPScreenRecorder sharedRecorder]; - recorder.microphoneEnabled = NO; // WebRTC handles mic input + recorder.microphoneEnabled = NO; // WebRTC handles mic input __weak __typeof__(self) weakSelf = self; - [recorder startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, - RPSampleBufferType bufferType, - NSError * _Nullable error) { - __strong __typeof__(weakSelf) strongSelf = weakSelf; - if (!strongSelf || error || !strongSelf->_capturing) { - return; - } + [recorder + startCaptureWithHandler:^( + CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError *_Nullable error) { + __strong __typeof__(weakSelf) strongSelf = weakSelf; + if (!strongSelf || error || !strongSelf->_capturing) { + return; + } - switch (bufferType) { - case RPSampleBufferTypeVideo: - [strongSelf processVideoSampleBuffer:sampleBuffer]; - break; - case RPSampleBufferTypeAudioApp: - if (strongSelf.audioBufferHandler) { - strongSelf.audioBufferHandler(sampleBuffer); - } - break; - case RPSampleBufferTypeAudioMic: - // Ignored — WebRTC handles mic capture via AudioDeviceModule - break; + switch (bufferType) { + case RPSampleBufferTypeVideo: + [strongSelf processVideoSampleBuffer:sampleBuffer]; + break; + case RPSampleBufferTypeAudioApp: + if (strongSelf.audioBufferHandler) { + strongSelf.audioBufferHandler(sampleBuffer); + } + break; + case RPSampleBufferTypeAudioMic: + // Ignored — WebRTC handles mic capture via AudioDeviceModule + break; + } } - } completionHandler:^(NSError * _Nullable error) { - __strong 
__typeof__(weakSelf) strongSelf = weakSelf; - if (!strongSelf) return; + completionHandler:^(NSError *_Nullable error) { + __strong __typeof__(weakSelf) strongSelf = weakSelf; + if (!strongSelf) + return; - if (error) { - NSLog(@"[InAppScreenCapturer] startCapture failed: %@", error.localizedDescription); - strongSelf->_capturing = NO; - [strongSelf.eventsDelegate capturerDidEnd:strongSelf]; - return; - } + if (error) { + NSLog(@"[InAppScreenCapturer] startCapture failed: %@", error.localizedDescription); + strongSelf->_capturing = NO; + [strongSelf.eventsDelegate capturerDidEnd:strongSelf]; + return; + } - // Capture started successfully — register for app lifecycle events. - // Done here (not in startCapture) so the RPScreenRecorder permission - // dialog doesn't trigger appWillResignActive before capture begins. - [strongSelf registerAppStateObservers]; - }]; + // Capture started successfully — register for app lifecycle events. + // Done here (not in startCapture) so the RPScreenRecorder permission + // dialog doesn't trigger appWillResignActive before capture begins. 
+            [strongSelf registerAppStateObservers];
+        }];
 }
 
 - (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer {
@@ -76,13 +78,13 @@ - (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer {
         return;
     }
 
-    int64_t timeStampNs = (int64_t)(CMTimeGetSeconds(
-        CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * NSEC_PER_SEC);
+    int64_t timeStampNs =
+        (int64_t)(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * NSEC_PER_SEC);
 
     RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
     RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
-                                                            rotation:RTCVideoRotation_0
-                                                            timeStampNs:timeStampNs];
+                                                             rotation:RTCVideoRotation_0
+                                                          timeStampNs:timeStampNs];
 
     [self.delegate capturer:self didCaptureVideoFrame:videoFrame];
 }
@@ -97,7 +99,7 @@ - (void)stopCapture {
 
     [self unregisterAppStateObservers];
 
-    [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
+    [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError *_Nullable error) {
         if (error) {
             NSLog(@"[InAppScreenCapturer] stopCapture error: %@", error.localizedDescription);
         }
@@ -107,10 +109,12 @@ - (void)stopCapture {
 #pragma mark - App Lifecycle
 
 - (void)registerAppStateObservers {
-    if (_observingAppState) return;
+    if (_observingAppState)
+        return;
 
     dispatch_async(dispatch_get_main_queue(), ^{
-        if (self->_observingAppState || !self->_capturing) return;
+        if (self->_observingAppState || !self->_capturing)
+            return;
         self->_observingAppState = YES;
 
         [[NSNotificationCenter defaultCenter] addObserver:self
@@ -125,12 +129,11 @@ - (void)registerAppStateObservers {
 }
 
 - (void)unregisterAppStateObservers {
-    if (!_observingAppState) return;
+    if (!_observingAppState)
+        return;
     _observingAppState = NO;
 
-    [[NSNotificationCenter defaultCenter] removeObserver:self
-                                                    name:UIApplicationDidBecomeActiveNotification
-                                                  object:nil];
+    [[NSNotificationCenter defaultCenter] removeObserver:self name:UIApplicationDidBecomeActiveNotification object:nil];
     [[NSNotificationCenter defaultCenter] removeObserver:self
                                                     name:UIApplicationWillResignActiveNotification
                                                   object:nil];
@@ -140,7 +143,7 @@ - (void)appWillResignActive {
     if (_capturing) {
         _shouldResumeOnForeground = YES;
         // Stop the RPScreenRecorder session — iOS suspends it in background anyway
-        [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
+        [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError *_Nullable error) {
             if (error) {
                 NSLog(@"[InAppScreenCapturer] background stop error: %@", error.localizedDescription);
             }
diff --git a/ios/RCTWebRTC/RCTConvert+WebRTC.m b/ios/RCTWebRTC/RCTConvert+WebRTC.m
index 363fe4add..9035b43a5 100644
--- a/ios/RCTWebRTC/RCTConvert+WebRTC.m
+++ b/ios/RCTWebRTC/RCTConvert+WebRTC.m
@@ -1,9 +1,12 @@
 #import
+#import
 #import
 #import
 
 #import "RCTConvert+WebRTC.h"
+#import "WebRTCModule+RTCPeerConnection.h"
+
 @implementation RCTConvert (WebRTC)
 
 + (RTCSessionDescription *)RTCSessionDescription:(id)json {
@@ -172,6 +175,22 @@ + (nonnull RTCConfiguration *)RTCConfiguration:(id)json {
         }
     }
 
+    if (json[@"certificates"] != nil && [json[@"certificates"] isKindOfClass:[NSArray class]]) {
+        NSArray *certs = json[@"certificates"];
+        if (certs.count > 0) {
+            id certInfo = certs[0];
+            if ([certInfo isKindOfClass:[NSDictionary class]]) {
+                NSString *certId = certInfo[@"certificateId"];
+                if (certId) {
+                    RTCCertificate *cert = [WebRTCModule getCertificate:certId];
+                    if (cert) {
+                        config.certificate = cert;
+                    }
+                }
+            }
+        }
+    }
+
     return config;
 }
 
diff --git a/ios/RCTWebRTC/RTCVideoViewManager.m b/ios/RCTWebRTC/RTCVideoViewManager.m
index a1b74e33b..102a1e58f 100644
--- a/ios/RCTWebRTC/RTCVideoViewManager.m
+++ b/ios/RCTWebRTC/RTCVideoViewManager.m
@@ -135,14 +135,14 @@ - (instancetype)initWithFrame:(CGRect)frame {
 #if TARGET_OS_OSX
 - (void)layout {
-  [super layout];
+    [super layout];
 #else
 - (void)layoutSubviews {
-  [super layoutSubviews];
+    [super layoutSubviews];
 #endif
-  CGRect bounds = self.bounds;
-  self.videoView.frame = bounds;
+    CGRect bounds = self.bounds;
+    self.videoView.frame = bounds;
 }
 
 /**
diff --git a/ios/RCTWebRTC/ScreenCaptureController.m b/ios/RCTWebRTC/ScreenCaptureController.m
index 2e002f785..caab33c10 100644
--- a/ios/RCTWebRTC/ScreenCaptureController.m
+++ b/ios/RCTWebRTC/ScreenCaptureController.m
@@ -55,11 +55,7 @@ - (void)stopCapture {
 }
 
 - (NSDictionary *)getSettings {
-    return @{
-        @"deviceId": self.deviceId,
-        @"groupId": @"",
-        @"frameRate" : @(30)
-    };
+    return @{@"deviceId" : self.deviceId, @"groupId" : @"", @"frameRate" : @(30)};
 }
 
 // MARK: CapturerEventsDelegate Methods
diff --git a/ios/RCTWebRTC/SerializeUtils.m b/ios/RCTWebRTC/SerializeUtils.m
index c4deef1b0..017995d2f 100644
--- a/ios/RCTWebRTC/SerializeUtils.m
+++ b/ios/RCTWebRTC/SerializeUtils.m
@@ -106,6 +106,9 @@ + (NSDictionary *)parametersToJSON:(RTCRtpParameters *)params {
         if (encoding.maxBitrateBps) {
             encodingDictionary[@"maxBitrate"] = encoding.maxBitrateBps;
         }
+        if (encoding.minBitrateBps) {
+            encodingDictionary[@"minBitrate"] = encoding.minBitrateBps;
+        }
         if (encoding.maxFramerate) {
             encodingDictionary[@"maxFramerate"] = encoding.maxFramerate;
         }
@@ -254,6 +257,9 @@ + (RTCRtpEncodingParameters *)parseEncoding:(NSDictionary *)params {
     if (params[@"maxBitrate"] != nil) {
         [encoding setMaxBitrateBps:(NSNumber *)params[@"maxBitrate"]];
     }
+    if (params[@"minBitrate"] != nil) {
+        [encoding setMinBitrateBps:(NSNumber *)params[@"minBitrate"]];
+    }
     if (params[@"maxFramerate"] != nil) {
         [encoding setMaxFramerate:(NSNumber *)params[@"maxFramerate"]];
     }
diff --git a/ios/RCTWebRTC/VideoCaptureController.h b/ios/RCTWebRTC/VideoCaptureController.h
index 335c80920..7df95c952 100644
--- a/ios/RCTWebRTC/VideoCaptureController.h
+++ b/ios/RCTWebRTC/VideoCaptureController.h
@@ -6,7 +6,7 @@
 #import "CaptureController.h"
 
 @interface VideoCaptureController : CaptureController
-@property (nonatomic, readonly, strong) RTCCameraVideoCapturer *capturer;
+@property(nonatomic, readonly, strong) RTCCameraVideoCapturer *capturer;
 @property(nonatomic, readonly, strong) AVCaptureDeviceFormat *selectedFormat;
 @property(nonatomic, readonly, assign) int frameRate;
 @property(nonatomic, assign) BOOL enableMultitaskingCameraAccess;
diff --git a/ios/RCTWebRTC/VideoCaptureController.m b/ios/RCTWebRTC/VideoCaptureController.m
index b0322c28a..8ae5ce194 100644
--- a/ios/RCTWebRTC/VideoCaptureController.m
+++ b/ios/RCTWebRTC/VideoCaptureController.m
@@ -24,6 +24,7 @@ - (instancetype)initWithCapturer:(RTCCameraVideoCapturer *)capturer andConstrain
     if (self) {
         self.capturer = capturer;
         self.running = NO;
+        [self determineDevice:constraints];
 
         [self applyConstraints:constraints error:nil];
     }
@@ -68,7 +69,7 @@ - (void)startCapture {
         BOOL shouldChange = session.multitaskingCameraAccessEnabled != enable;
         BOOL canChange = !enable || (enable && session.isMultitaskingCameraAccessSupported);
 
-        if(shouldChange && canChange) {
+        if (shouldChange && canChange) {
             [session beginConfiguration];
             [session setMultitaskingCameraAccessEnabled:enable];
             [session commitConfiguration];
@@ -117,32 +118,10 @@ - (void)stopCapture {
     dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
 }
 
-- (void)applyConstraints:(NSDictionary *)constraints error:(NSError **)outError {
+- (void)determineDevice:(NSDictionary *)constraints {
     // Clear device to prepare for starting camera with new constraints.
self.device = nil; + NSString *deviceId = constraints[@"deviceId"]; - - BOOL hasChanged = NO; - NSString *deviceId = constraints[@"deviceId"]; - int width = [constraints[@"width"] intValue]; - int height = [constraints[@"height"] intValue]; - int frameRate = [constraints[@"frameRate"] intValue]; - - if (self.width != width) { - hasChanged = YES; - self.width = width; - } - - if (self.height != height) { - hasChanged = YES; - self.height = height; - } - - if (self.frameRate != frameRate) { - hasChanged = YES; - self.frameRate = frameRate; - } - id facingMode = constraints[@"facingMode"]; if (!facingMode && !deviceId) { @@ -164,7 +143,6 @@ - (void)applyConstraints:(NSDictionary *)constraints error:(NSError **)outError BOOL usingFrontCamera = position == AVCaptureDevicePositionFront; if (self.usingFrontCamera != usingFrontCamera) { - hasChanged = YES; self.usingFrontCamera = usingFrontCamera; } } @@ -174,12 +152,33 @@ self.usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack; deviceId = [self findDeviceForPosition:position].uniqueID; } - + if (self.deviceId != deviceId && ![self.deviceId isEqualToString:deviceId]) { - hasChanged = YES; self.deviceId = deviceId; } +} + +- (void)applyConstraints:(NSDictionary *)constraints error:(NSError **)outError { + BOOL hasChanged = NO; + + int width = [constraints[@"width"] intValue]; + int height = [constraints[@"height"] intValue]; + int frameRate = [constraints[@"frameRate"] intValue]; + + if (self.width != width) { + hasChanged = YES; + self.width = width; + } + if (self.height != height) { + hasChanged = YES; + self.height = height; + } + + if (self.frameRate != frameRate) { + hasChanged = YES; + self.frameRate = frameRate; + } if (self.running && hasChanged) { [self stopCapture]; @@ -191,7 +190,7 @@ - (NSDictionary *)getSettings { AVCaptureDeviceFormat *format = self.selectedFormat; CMVideoDimensions dimensions = 
CMVideoFormatDescriptionGetDimensions(format.formatDescription); NSMutableDictionary *settings = [[NSMutableDictionary alloc] initWithDictionary:@{ - @"groupId": @"", + @"groupId" : @"", @"height" : @(dimensions.height), @"width" : @(dimensions.width), @"frameRate" : @(30), diff --git a/ios/RCTWebRTC/WebRTCModule+RTCAudioDeviceModule.m b/ios/RCTWebRTC/WebRTCModule+RTCAudioDeviceModule.m index b531781c7..72c15764e 100644 --- a/ios/RCTWebRTC/WebRTCModule+RTCAudioDeviceModule.m +++ b/ios/RCTWebRTC/WebRTCModule+RTCAudioDeviceModule.m @@ -30,37 +30,61 @@ - (void)handleADMResult:(NSInteger)result RCT_EXPORT_METHOD(audioDeviceModuleStartPlayout : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM startPlayout] operation:@"start playout" code:@"playout_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM startPlayout] + operation:@"start playout" + code:@"playout_error" + resolve:resolve + reject:reject]; } RCT_EXPORT_METHOD(audioDeviceModuleStopPlayout : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM stopPlayout] operation:@"stop playout" code:@"playout_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM stopPlayout] + operation:@"stop playout" + code:@"playout_error" + resolve:resolve + reject:reject]; } RCT_EXPORT_METHOD(audioDeviceModuleStartRecording : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM startRecording] operation:@"start recording" code:@"recording_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM startRecording] + operation:@"start recording" + code:@"recording_error" + resolve:resolve + reject:reject]; } RCT_EXPORT_METHOD(audioDeviceModuleStopRecording : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM stopRecording] operation:@"stop recording" 
code:@"recording_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM stopRecording] + operation:@"stop recording" + code:@"recording_error" + resolve:resolve + reject:reject]; } RCT_EXPORT_METHOD(audioDeviceModuleStartLocalRecording : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM initAndStartRecording] operation:@"start local recording" code:@"recording_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM initAndStartRecording] + operation:@"start local recording" + code:@"recording_error" + resolve:resolve + reject:reject]; } RCT_EXPORT_METHOD(audioDeviceModuleStopLocalRecording : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM stopRecording] operation:@"stop local recording" code:@"recording_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM stopRecording] + operation:@"stop local recording" + code:@"recording_error" + resolve:resolve + reject:reject]; } #pragma mark - Microphone Control @@ -69,7 +93,11 @@ - (void)handleADMResult:(NSInteger)result : (BOOL)muted resolver : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM setMicrophoneMuted:muted] operation:@"set microphone mute" code:@"mute_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM setMicrophoneMuted:muted] + operation:@"set microphone mute" + code:@"mute_error" + resolve:resolve + reject:reject]; } RCT_EXPORT_BLOCKING_SYNCHRONOUS_METHOD(audioDeviceModuleIsMicrophoneMuted) { @@ -82,7 +110,11 @@ - (void)handleADMResult:(NSInteger)result : (BOOL)enabled resolver : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM setVoiceProcessingEnabled:enabled] operation:@"set voice processing" code:@"voice_processing_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM 
setVoiceProcessingEnabled:enabled] + operation:@"set voice processing" + code:@"voice_processing_error" + resolve:resolve + reject:reject]; } RCT_EXPORT_BLOCKING_SYNCHRONOUS_METHOD(audioDeviceModuleIsVoiceProcessingEnabled) { @@ -127,7 +159,11 @@ - (void)handleADMResult:(NSInteger)result : (NSInteger)mode resolver : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM setMuteMode:(RTCAudioEngineMuteMode)mode] operation:@"set mute mode" code:@"mute_mode_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM setMuteMode:(RTCAudioEngineMuteMode)mode] + operation:@"set mute mode" + code:@"mute_mode_error" + resolve:resolve + reject:reject]; } RCT_EXPORT_BLOCKING_SYNCHRONOUS_METHOD(audioDeviceModuleGetMuteMode) { @@ -160,7 +196,11 @@ - (void)handleADMResult:(NSInteger)result : (BOOL)enabled resolver : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { - [self handleADMResult:[RAW_ADM setRecordingAlwaysPreparedMode:enabled] operation:@"set recording always prepared mode" code:@"recording_always_prepared_mode_error" resolve:resolve reject:reject]; + [self handleADMResult:[RAW_ADM setRecordingAlwaysPreparedMode:enabled] + operation:@"set recording always prepared mode" + code:@"recording_always_prepared_mode_error" + resolve:resolve + reject:reject]; } // TODO: `getEngineAvailability` / `setEngineAvailability` were dropped because the diff --git a/ios/RCTWebRTC/WebRTCModule+RTCFrameCryptor.m b/ios/RCTWebRTC/WebRTCModule+RTCFrameCryptor.m new file mode 100644 index 000000000..dd3ce112e --- /dev/null +++ b/ios/RCTWebRTC/WebRTCModule+RTCFrameCryptor.m @@ -0,0 +1,611 @@ +#include +#import + +#import +#import + +#import "WebRTCModule+RTCPeerConnection.h" +#import "WebRTCModule.h" + +// Key for objc_set/getAssociatedObject, value of NSString* +static char frameCryptorUUIDKey; + +@interface WebRTCModule () +@end + +@implementation WebRTCModule (RTCFrameCryptor) + +- 
(RTCCryptorAlgorithm)getAlgorithm:(NSNumber *)algorithm { + switch ([algorithm intValue]) { + // case 0: + // return RTCCryptorAlgorithmAesGcm; + // case 1: + // return RTCCryptorAlgorithmAesCbc; + default: + return RTCCryptorAlgorithmAesGcm; + } +} + +- (NSData *)bytesFromMap:(NSDictionary *)map key:(NSString *)key isBase64Key:(nullable NSString *)isBase64Key { + BOOL isBase64 = YES; + if (isBase64Key) { + isBase64 = [map[isBase64Key] boolValue]; + } + + if (isBase64) { + return [[NSData alloc] initWithBase64EncodedString:map[key] options:0]; + } else { + return [map[key] dataUsingEncoding:NSUTF8StringEncoding]; + } +} + +RCT_EXPORT_BLOCKING_SYNCHRONOUS_METHOD(frameCryptorFactoryCreateFrameCryptor : (nonnull NSDictionary *)constraints) { + __block NSString *frameCryptorId = nil; + dispatch_sync(self.workerQueue, ^{ + NSNumber *peerConnectionId = constraints[@"peerConnectionId"]; + NSNumber *algorithm = constraints[@"algorithm"]; + if (algorithm == nil) { + NSLog(@"frameCryptorFactoryCreateFrameCryptorFailed: Invalid algorithm"); + return; + } + + NSString *participantId = constraints[@"participantId"]; + if (participantId == nil) { + NSLog(@"frameCryptorFactoryCreateFrameCryptorFailed: Invalid participantId"); + return; + } + + NSString *keyProviderId = constraints[@"keyProviderId"]; + if (keyProviderId == nil) { + NSLog(@"frameCryptorFactoryCreateFrameCryptorFailed: Invalid keyProviderId"); + return; + } + + RTCFrameCryptorKeyProvider *keyProvider = self.keyProviders[keyProviderId]; + if (keyProvider == nil) { + NSLog(@"frameCryptorFactoryCreateFrameCryptorFailed: Invalid keyProvider"); + return; + } + + NSString *type = constraints[@"type"]; + NSString *rtpSenderId = constraints[@"rtpSenderId"]; + NSString *rtpReceiverId = constraints[@"rtpReceiverId"]; + + if ([type isEqualToString:@"sender"]) { + RTCRtpSender *sender = [self getSenderByPeerConnectionId:peerConnectionId senderId:rtpSenderId]; + + if (sender == nil) { + 
NSLog(@"frameCryptorFactoryCreateFrameCryptorFailed: Error: sender not found!"); + return; + } + + RTCFrameCryptor *frameCryptor = [[RTCFrameCryptor alloc] initWithFactory:self.peerConnectionFactory + rtpSender:sender + participantId:participantId + algorithm:[self getAlgorithm:algorithm] + keyProvider:keyProvider]; + frameCryptorId = [[NSUUID UUID] UUIDString]; + + frameCryptor.delegate = self; + + self.frameCryptors[frameCryptorId] = frameCryptor; + objc_setAssociatedObject(frameCryptor, &frameCryptorUUIDKey, frameCryptorId, OBJC_ASSOCIATION_COPY); + return; + } else if ([type isEqualToString:@"receiver"]) { + RTCRtpReceiver *receiver = [self getReceiverByPeerConnectionId:peerConnectionId receiverId:rtpReceiverId]; + if (receiver == nil) { + NSLog(@"frameCryptorFactoryCreateFrameCryptorFailed: Error: receiver not found!"); + return; + } + RTCFrameCryptor *frameCryptor = [[RTCFrameCryptor alloc] initWithFactory:self.peerConnectionFactory + rtpReceiver:receiver + participantId:participantId + algorithm:[self getAlgorithm:algorithm] + keyProvider:keyProvider]; + frameCryptorId = [[NSUUID UUID] UUIDString]; + + frameCryptor.delegate = self; + + self.frameCryptors[frameCryptorId] = frameCryptor; + objc_setAssociatedObject(frameCryptor, &frameCryptorUUIDKey, frameCryptorId, OBJC_ASSOCIATION_COPY); + return; + } else { + NSLog(@"InvalidArgument: Invalid type"); + return; + } + }); + + return frameCryptorId; +} + +RCT_EXPORT_METHOD(frameCryptorSetKeyIndex + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *frameCryptorId = constraints[@"frameCryptorId"]; + if (frameCryptorId == nil) { + reject(@"frameCryptorSetKeyIndexFailed", @"Invalid frameCryptorId", nil); + return; + } + RTCFrameCryptor *frameCryptor = self.frameCryptors[frameCryptorId]; + if (frameCryptor == nil) { + reject(@"frameCryptorSetKeyIndexFailed", @"Invalid frameCryptor", nil); + return; + } + + NSNumber *keyIndex = 
constraints[@"keyIndex"]; + if (keyIndex == nil) { + reject(@"frameCryptorSetKeyIndexFailed", @"Invalid keyIndex", nil); + return; + } + [frameCryptor setKeyIndex:[keyIndex intValue]]; + resolve(@{@"result" : @YES}); +} + +RCT_EXPORT_METHOD(frameCryptorGetKeyIndex + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *frameCryptorId = constraints[@"frameCryptorId"]; + if (frameCryptorId == nil) { + reject(@"frameCryptorGetKeyIndexFailed", @"Invalid frameCryptorId", nil); + return; + } + RTCFrameCryptor *frameCryptor = self.frameCryptors[frameCryptorId]; + if (frameCryptor == nil) { + reject(@"frameCryptorGetKeyIndexFailed", @"Invalid frameCryptor", nil); + return; + } + resolve(@{@"keyIndex" : [NSNumber numberWithInt:frameCryptor.keyIndex]}); +} + +RCT_EXPORT_METHOD(frameCryptorSetEnabled + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *frameCryptorId = constraints[@"frameCryptorId"]; + if (frameCryptorId == nil) { + reject(@"frameCryptorSetEnabledFailed", @"Invalid frameCryptorId", nil); + return; + } + RTCFrameCryptor *frameCryptor = self.frameCryptors[frameCryptorId]; + if (frameCryptor == nil) { + reject(@"frameCryptorSetEnabledFailed", @"Invalid frameCryptor", nil); + return; + } + + NSNumber *enabled = constraints[@"enabled"]; + if (enabled == nil) { + reject(@"frameCryptorSetEnabledFailed", @"Invalid enabled", nil); + return; + } + frameCryptor.enabled = [enabled boolValue]; + resolve(@{@"result" : enabled}); +} + +RCT_EXPORT_METHOD(frameCryptorGetEnabled + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *frameCryptorId = constraints[@"frameCryptorId"]; + if (frameCryptorId == nil) { + reject(@"frameCryptorGetEnabledFailed", @"Invalid frameCryptorId", nil); + return; + } + RTCFrameCryptor 
*frameCryptor = self.frameCryptors[frameCryptorId]; + if (frameCryptor == nil) { + reject(@"frameCryptorGetEnabledFailed", @"Invalid frameCryptor", nil); + return; + } + resolve(@{@"enabled" : [NSNumber numberWithBool:frameCryptor.enabled]}); +} + +RCT_EXPORT_METHOD(frameCryptorDispose + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *frameCryptorId = constraints[@"frameCryptorId"]; + if (frameCryptorId == nil) { + reject(@"frameCryptorDisposeFailed", @"Invalid frameCryptorId", nil); + return; + } + RTCFrameCryptor *frameCryptor = self.frameCryptors[frameCryptorId]; + if (frameCryptor == nil) { + reject(@"frameCryptorDisposeFailed", @"Invalid frameCryptor", nil); + return; + } + [self.frameCryptors removeObjectForKey:frameCryptorId]; + frameCryptor.enabled = NO; + resolve(@{@"result" : @"success"}); +} + +RCT_EXPORT_BLOCKING_SYNCHRONOUS_METHOD(frameCryptorFactoryCreateKeyProvider + : (nonnull NSDictionary *)keyProviderOptions) { + __block NSString *keyProviderId = [[NSUUID UUID] UUIDString]; + + dispatch_sync(self.workerQueue, ^{ + NSNumber *sharedKey = keyProviderOptions[@"sharedKey"]; + if (sharedKey == nil) { + NSLog(@"frameCryptorFactoryCreateKeyProviderFailed: Invalid sharedKey"); + keyProviderId = nil; + return; + } + + if (keyProviderOptions[@"ratchetSalt"] == nil) { + NSLog(@"frameCryptorFactoryCreateKeyProviderFailed: Invalid ratchetSalt"); + keyProviderId = nil; + return; + } + NSData *ratchetSalt = [self bytesFromMap:keyProviderOptions + key:@"ratchetSalt" + isBase64Key:@"ratchetSaltIsBase64"]; + + NSNumber *ratchetWindowSize = keyProviderOptions[@"ratchetWindowSize"]; + if (ratchetWindowSize == nil) { + NSLog(@"frameCryptorFactoryCreateKeyProviderFailed: Invalid ratchetWindowSize"); + keyProviderId = nil; + return; + } + + NSNumber *failureTolerance = keyProviderOptions[@"failureTolerance"]; + NSData *uncryptedMagicBytes = nil; + + if 
(keyProviderOptions[@"uncryptedMagicBytes"] != nil) { + uncryptedMagicBytes = [[NSData alloc] initWithBase64EncodedString:keyProviderOptions[@"uncryptedMagicBytes"] + options:0]; + } + + NSNumber *keyRingSize = keyProviderOptions[@"keyRingSize"]; + NSNumber *discardFrameWhenCryptorNotReady = keyProviderOptions[@"discardFrameWhenCryptorNotReady"]; + + RTCFrameCryptorKeyProvider *keyProvider = [[RTCFrameCryptorKeyProvider alloc] + initWithRatchetSalt:ratchetSalt + ratchetWindowSize:[ratchetWindowSize intValue] + sharedKeyMode:[sharedKey boolValue] + uncryptedMagicBytes:uncryptedMagicBytes + failureTolerance:failureTolerance != nil ? [failureTolerance intValue] : -1 + keyRingSize:keyRingSize != nil ? [keyRingSize intValue] : 0 + discardFrameWhenCryptorNotReady:discardFrameWhenCryptorNotReady != nil + ? [discardFrameWhenCryptorNotReady boolValue] + : NO]; + self.keyProviders[keyProviderId] = keyProvider; + return; + }); + return keyProviderId; +} + +- (nullable RTCFrameCryptorKeyProvider *)getKeyProviderForId:(NSString *)keyProviderId + rejecter:(RCTPromiseRejectBlock)reject { + if (keyProviderId == nil) { + reject(@"getKeyProviderForIdFailed", @"Invalid keyProviderId", nil); + return nil; + } + RTCFrameCryptorKeyProvider *keyProvider = self.keyProviders[keyProviderId]; + if (keyProvider == nil) { + reject(@"getKeyProviderForIdFailed", @"Invalid keyProvider", nil); + return nil; + } + return keyProvider; +} + +RCT_EXPORT_METHOD(keyProviderSetSharedKey + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + RTCFrameCryptorKeyProvider *keyProvider = [self getKeyProviderForId:constraints[@"keyProviderId"] rejecter:reject]; + if (keyProvider == nil) { + return; + } + + NSNumber *keyIndex = constraints[@"keyIndex"]; + if (keyIndex == nil) { + reject(@"keyProviderSetSharedKey", @"Invalid keyIndex", nil); + return; + } + + if (constraints[@"key"] == nil) { + reject(@"keyProviderSetSharedKey", 
@"Invalid key", nil); + return; + } + NSData *key = [self bytesFromMap:constraints key:@"key" isBase64Key:@"keyIsBase64"]; + + [keyProvider setSharedKey:key withIndex:[keyIndex intValue]]; + resolve(@{@"result" : @YES}); +} + +RCT_EXPORT_METHOD(keyProviderRatchetSharedKey + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + RTCFrameCryptorKeyProvider *keyProvider = [self getKeyProviderForId:constraints[@"keyProviderId"] rejecter:reject]; + if (keyProvider == nil) { + return; + } + + NSNumber *keyIndex = constraints[@"keyIndex"]; + if (keyIndex == nil) { + reject(@"keyProviderRatchetSharedKeyFailed", @"Invalid keyIndex", nil); + return; + } + + NSData *newKey = [keyProvider ratchetSharedKey:[keyIndex intValue]]; + resolve(@{@"result" : [newKey base64EncodedStringWithOptions:0]}); +} + +RCT_EXPORT_METHOD(keyProviderExportSharedKey + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + RTCFrameCryptorKeyProvider *keyProvider = [self getKeyProviderForId:constraints[@"keyProviderId"] rejecter:reject]; + if (keyProvider == nil) { + return; + } + + NSNumber *keyIndex = constraints[@"keyIndex"]; + if (keyIndex == nil) { + reject(@"keyProviderExportSharedKeyFailed", @"Invalid keyIndex", nil); + return; + } + + NSData *key = [keyProvider exportSharedKey:[keyIndex intValue]]; + resolve(@{@"result" : [key base64EncodedStringWithOptions:0]}); +} + +RCT_EXPORT_METHOD(keyProviderSetKey + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + RTCFrameCryptorKeyProvider *keyProvider = [self getKeyProviderForId:constraints[@"keyProviderId"] rejecter:reject]; + if (keyProvider == nil) { + return; + } + + NSNumber *keyIndex = constraints[@"keyIndex"]; + if (keyIndex == nil) { + reject(@"keyProviderSetKeyFailed", @"Invalid keyIndex", nil); + return; + } + 
+ if (constraints[@"key"] == nil) { + reject(@"keyProviderSetKeyFailed", @"Invalid key", nil); + return; + } + NSData *key = [self bytesFromMap:constraints key:@"key" isBase64Key:@"keyIsBase64"]; + + NSString *participantId = constraints[@"participantId"]; + if (participantId == nil) { + reject(@"keyProviderSetKeyFailed", @"Invalid participantId", nil); + return; + } + + [keyProvider setKey:key withIndex:[keyIndex intValue] forParticipant:participantId]; + resolve(@{@"result" : @YES}); +} + +RCT_EXPORT_METHOD(keyProviderRatchetKey + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + RTCFrameCryptorKeyProvider *keyProvider = [self getKeyProviderForId:constraints[@"keyProviderId"] rejecter:reject]; + if (keyProvider == nil) { + return; + } + + NSNumber *keyIndex = constraints[@"keyIndex"]; + if (keyIndex == nil) { + reject(@"keyProviderRatchetKeyFailed", @"Invalid keyIndex", nil); + return; + } + + NSString *participantId = constraints[@"participantId"]; + if (participantId == nil) { + reject(@"keyProviderRatchetKeyFailed", @"Invalid participantId", nil); + return; + } + + NSData *newKey = [keyProvider ratchetKey:participantId withIndex:[keyIndex intValue]]; + resolve(@{@"result" : [newKey base64EncodedStringWithOptions:0]}); +} + +RCT_EXPORT_METHOD(keyProviderExportKey + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + RTCFrameCryptorKeyProvider *keyProvider = [self getKeyProviderForId:constraints[@"keyProviderId"] rejecter:reject]; + if (keyProvider == nil) { + return; + } + + NSNumber *keyIndex = constraints[@"keyIndex"]; + if (keyIndex == nil) { + reject(@"keyProviderExportKeyFailed", @"Invalid keyIndex", nil); + return; + } + + NSString *participantId = constraints[@"participantId"]; + if (participantId == nil) { + reject(@"keyProviderExportKeyFailed", @"Invalid participantId", nil); + return; + } + + 
NSData *key = [keyProvider exportKey:participantId withIndex:[keyIndex intValue]]; + resolve(@{@"result" : [key base64EncodedStringWithOptions:0]}); +} + +RCT_EXPORT_METHOD(keyProviderSetSifTrailer + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + RTCFrameCryptorKeyProvider *keyProvider = [self getKeyProviderForId:constraints[@"keyProviderId"] rejecter:reject]; + if (keyProvider == nil) { + return; + } + + if (constraints[@"sifTrailer"] == nil) { + reject(@"keyProviderSetSifTrailerFailed", @"Invalid sifTrailer", nil); + return; + } + NSData *sifTrailer = [[NSData alloc] initWithBase64EncodedString:constraints[@"sifTrailer"] options:0]; + + [keyProvider setSifTrailer:sifTrailer]; + resolve(nil); +} + +RCT_EXPORT_METHOD(keyProviderDispose + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *keyProviderId = constraints[@"keyProviderId"]; + if (keyProviderId == nil) { + reject(@"getKeyProviderForIdFailed", @"Invalid keyProviderId", nil); + return; + } + [self.keyProviders removeObjectForKey:keyProviderId]; + resolve(@{@"result" : @"success"}); +} + +- (NSString *)stringFromState:(RTCFrameCryptorState)state { + switch (state) { + case RTCFrameCryptorStateNew: + return @"new"; + case RTCFrameCryptorStateOk: + return @"ok"; + case RTCFrameCryptorStateEncryptionFailed: + return @"encryptionFailed"; + case RTCFrameCryptorStateDecryptionFailed: + return @"decryptionFailed"; + case RTCFrameCryptorStateMissingKey: + return @"missingKey"; + case RTCFrameCryptorStateKeyRatcheted: + return @"keyRatcheted"; + case RTCFrameCryptorStateInternalError: + return @"internalError"; + default: + return @"unknown"; + } +} + +RCT_EXPORT_METHOD(dataPacketCryptorFactoryCreateDataPacketCryptor + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + 
NSNumber *algorithm = constraints[@"algorithm"]; + NSString *keyProviderId = constraints[@"keyProviderId"]; + if (keyProviderId == nil) { + reject(@"dataPacketCryptorFactoryCreateDataPacketCryptorFailed", @"Invalid keyProviderId", nil); + return; + } + + RTCFrameCryptorKeyProvider *keyProvider = self.keyProviders[keyProviderId]; + if (keyProvider == nil) { + reject(@"getKeyProviderForIdFailed", @"Invalid keyProviderId", nil); + return; + } + + RTCDataPacketCryptor *cryptor = [[RTCDataPacketCryptor alloc] initWithAlgorithm:[self getAlgorithm:algorithm] + keyProvider:keyProvider]; + NSString *cryptorId = [[NSUUID UUID] UUIDString]; + + self.dataPacketCryptors[cryptorId] = cryptor; + + resolve(@{@"dataPacketCryptorId" : cryptorId}); +} + +RCT_EXPORT_METHOD(dataPacketCryptorEncrypt + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *cryptorId = constraints[@"dataPacketCryptorId"]; + NSString *participantId = constraints[@"participantId"]; + NSNumber *keyIndex = constraints[@"keyIndex"]; + NSData *data = [self bytesFromMap:constraints key:@"data" isBase64Key:nil]; + + RTCDataPacketCryptor *cryptor = self.dataPacketCryptors[cryptorId]; + + if (cryptor == nil) { + reject(@"dataPacketCryptorEncryptFailed", @"data packet cryptor not found", nil); + return; + } + + RTCEncryptedPacket *packet = [cryptor encrypt:participantId keyIndex:[keyIndex unsignedIntValue] data:data]; + + if (packet == nil) { + reject(@"dataPacketCryptorEncryptFailed", @"packet encryption failed", nil); + return; + } + + resolve(@{ + @"payload" : [packet.data base64EncodedStringWithOptions:0], + @"iv" : [packet.iv base64EncodedStringWithOptions:0], + @"keyIndex" : [NSNumber numberWithUnsignedInt:packet.keyIndex] + }); +} + +RCT_EXPORT_METHOD(dataPacketCryptorDecrypt + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString 
*cryptorId = constraints[@"dataPacketCryptorId"]; + NSString *participantId = constraints[@"participantId"]; + NSNumber *keyIndex = constraints[@"keyIndex"]; + NSData *payload = [self bytesFromMap:constraints key:@"payload" isBase64Key:nil]; + NSData *iv = [self bytesFromMap:constraints key:@"iv" isBase64Key:nil]; + + RTCDataPacketCryptor *cryptor = self.dataPacketCryptors[cryptorId]; + + if (cryptor == nil) { + reject(@"dataPacketCryptorDecryptFailed", @"data packet cryptor not found", nil); + return; + } + + RTCEncryptedPacket *packet = [[RTCEncryptedPacket alloc] initWithData:payload + iv:iv + keyIndex:[keyIndex unsignedIntValue]]; + NSData *decryptedData = [cryptor decrypt:participantId encryptedPacket:packet]; + + if (decryptedData == nil) { + reject(@"dataPacketCryptorDecryptFailed", @"packet decryption failed", nil); + return; + } + + resolve(@{@"data" : [decryptedData base64EncodedStringWithOptions:0]}); +} +RCT_EXPORT_METHOD(dataPacketCryptorDispose + : (nonnull NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *cryptorId = constraints[@"dataPacketCryptorId"]; + + RTCDataPacketCryptor *cryptor = self.dataPacketCryptors[cryptorId]; + + if (cryptor == nil) { + reject(@"dataPacketCryptorDisposeFailed", @"data packet cryptor not found", nil); + return; + } + + [self.dataPacketCryptors removeObjectForKey:cryptorId]; + resolve(@{@"result" : @"success"}); +} + +#pragma mark - RTCFrameCryptorDelegate methods + +- (void)frameCryptor:(RTC_OBJC_TYPE(RTCFrameCryptor) *)frameCryptor + didStateChangeWithParticipantId:(NSString *)participantId + withState:(RTCFrameCryptorState)stateChanged { + id frameCryptorId = objc_getAssociatedObject(frameCryptor, &frameCryptorUUIDKey); + + if (![frameCryptorId isKindOfClass:[NSString class]]) { + NSLog(@"Received frameCryptordidStateChangeWithParticipantId event for frame cryptor without UUID!"); + return; + } + + NSDictionary *event = @{ + @"event" : 
kEventFrameCryptionStateChanged, + @"participantId" : participantId, + @"frameCryptorId" : (NSString *)frameCryptorId, + @"state" : [self stringFromState:stateChanged] + }; + [self sendEventWithName:kEventFrameCryptionStateChanged body:event]; +} + +@end diff --git a/ios/RCTWebRTC/WebRTCModule+RTCMediaStream.h b/ios/RCTWebRTC/WebRTCModule+RTCMediaStream.h index 5b6353fd0..082ea1709 100644 --- a/ios/RCTWebRTC/WebRTCModule+RTCMediaStream.h +++ b/ios/RCTWebRTC/WebRTCModule+RTCMediaStream.h @@ -1,10 +1,10 @@ #import "CaptureController.h" -#import "WebRTCModule.h" #import "VideoEffectProcessor.h" +#import "WebRTCModule.h" @interface WebRTCModule (RTCMediaStream) -@property (nonatomic, strong) VideoEffectProcessor *videoEffectProcessor; +@property(nonatomic, strong) VideoEffectProcessor *videoEffectProcessor; - (RTCVideoTrack *)createVideoTrackWithCaptureController: (CaptureController * (^)(RTCVideoSource *))captureControllerCreator; @@ -13,4 +13,5 @@ - (void)addLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack; - (void)removeLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack; +- (RTCMediaStreamTrack *)trackForId:(nonnull NSString *)trackId pcId:(nonnull NSNumber *)pcId; @end \ No newline at end of file diff --git a/ios/RCTWebRTC/WebRTCModule+RTCMediaStream.m b/ios/RCTWebRTC/WebRTCModule+RTCMediaStream.m index c4562df70..c509584a5 100644 --- a/ios/RCTWebRTC/WebRTCModule+RTCMediaStream.m +++ b/ios/RCTWebRTC/WebRTCModule+RTCMediaStream.m @@ -7,14 +7,14 @@ #import #import "RTCMediaStreamTrack+React.h" -#import "WebRTCModuleOptions.h" #import "WebRTCModule+RTCMediaStream.h" #import "WebRTCModule+RTCPeerConnection.h" #import "WebRTCModule+VideoTrackAdapter.h" +#import "WebRTCModuleOptions.h" -#import "ProcessorProvider.h" #import "InAppScreenCaptureController.h" #import "InAppScreenCapturer.h" +#import "ProcessorProvider.h" #import "ScreenCaptureController.h" #import "ScreenCapturer.h" #import "TrackCapturerEventsEmitter.h" @@ -29,14 +29,13 @@ 
@implementation WebRTCModule (RTCMediaStream) -- (VideoEffectProcessor *)videoEffectProcessor -{ - return objc_getAssociatedObject(self, _cmd); +- (VideoEffectProcessor *)videoEffectProcessor { + return objc_getAssociatedObject(self, _cmd); } -- (void)setVideoEffectProcessor:(VideoEffectProcessor *)videoEffectProcessor -{ - objc_setAssociatedObject(self, @selector(videoEffectProcessor), videoEffectProcessor, OBJC_ASSOCIATION_RETAIN_NONATOMIC); +- (void)setVideoEffectProcessor:(VideoEffectProcessor *)videoEffectProcessor { + objc_setAssociatedObject( + self, @selector(videoEffectProcessor), videoEffectProcessor, OBJC_ASSOCIATION_RETAIN_NONATOMIC); } #pragma mark - Default Media Constraints @@ -48,13 +47,13 @@ - (void)setVideoEffectProcessor:(VideoEffectProcessor *)videoEffectProcessor */ + (NSDictionary *)commonOptionalConstraints { return @{ - @"DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue, - @"googAutoGainControl": kRTCMediaConstraintsValueTrue, - @"googNoiseSuppression": kRTCMediaConstraintsValueTrue, - @"googEchoCancellation": kRTCMediaConstraintsValueTrue, - @"googHighpassFilter": kRTCMediaConstraintsValueTrue, - @"googTypingNoiseDetection": kRTCMediaConstraintsValueTrue, - @"googAudioMirroring": kRTCMediaConstraintsValueFalse + @"DtlsSrtpKeyAgreement" : kRTCMediaConstraintsValueTrue, + @"googAutoGainControl" : kRTCMediaConstraintsValueTrue, + @"googNoiseSuppression" : kRTCMediaConstraintsValueTrue, + @"googEchoCancellation" : kRTCMediaConstraintsValueTrue, + @"googHighpassFilter" : kRTCMediaConstraintsValueTrue, + @"googTypingNoiseDetection" : kRTCMediaConstraintsValueTrue, + @"googAudioMirroring" : kRTCMediaConstraintsValueFalse }; } @@ -154,8 +153,8 @@ - (NSArray *)createMediaStream:(NSArray *)tracks { } } else if ([track.kind isEqualToString:@"audio"]) { settings = @{ - @"deviceId": @"audio", - @"groupId": @"", + @"deviceId" : @"audio", + @"groupId" : @"", }; } @@ -185,15 +184,23 @@ - (RTCVideoTrack *)createVideoTrack:(NSDictionary *)constraints { 
NSString *trackUUID = [[NSUUID UUID] UUIDString]; RTCVideoTrack *videoTrack = [self.peerConnectionFactory videoTrackWithSource:videoSource trackId:trackUUID]; -#if !TARGET_IPHONE_SIMULATOR - RTCCameraVideoCapturer *videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:videoSource]; - VideoCaptureController *videoCaptureController = - [[VideoCaptureController alloc] initWithCapturer:videoCapturer andConstraints:constraints[@"video"]]; - videoCaptureController.enableMultitaskingCameraAccess = [WebRTCModuleOptions sharedInstance].enableMultitaskingCameraAccess; - videoTrack.captureController = videoCaptureController; - [videoCaptureController startCapture]; + BOOL hasRuntimeVideoDevice = YES; +#if TARGET_IPHONE_SIMULATOR + // On simulator, a runtime-provided video source may exist (e.g. virtual camera), + // so only skip camera capture setup when no runtime video device is available. + hasRuntimeVideoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] != nil; #endif + if (hasRuntimeVideoDevice) { + RTCCameraVideoCapturer *videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:videoSource]; + VideoCaptureController *videoCaptureController = + [[VideoCaptureController alloc] initWithCapturer:videoCapturer andConstraints:constraints[@"video"]]; + videoCaptureController.enableMultitaskingCameraAccess = + [WebRTCModuleOptions sharedInstance].enableMultitaskingCameraAccess; + videoTrack.captureController = videoCaptureController; + [videoCaptureController startCapture]; + } + // Add dimension detection for local video tracks immediately [self addLocalVideoTrackDimensionDetection:videoTrack]; @@ -244,7 +251,10 @@ - (RTCVideoTrack *)createScreenCaptureVideoTrack { return videoTrack; } -RCT_EXPORT_METHOD(getDisplayMedia : (RCTPromiseResolveBlock)resolve rejecter : (RCTPromiseRejectBlock)reject) { +RCT_EXPORT_METHOD(getDisplayMedia + : (NSDictionary *)constraints resolver + : (RCTPromiseResolveBlock)resolve rejecter + : 
(RCTPromiseRejectBlock)reject) { #if TARGET_OS_TV reject(@"unsupported_platform", @"tvOS is not supported", nil); return; @@ -338,8 +348,8 @@ - (RTCVideoTrack *)createScreenCaptureVideoTrack { } } else if ([track.kind isEqualToString:@"audio"]) { settings = @{ - @"deviceId": @"audio", - @"groupId": @"", + @"deviceId" : @"audio", + @"groupId" : @"", }; } @@ -366,7 +376,14 @@ - (RTCVideoTrack *)createScreenCaptureVideoTrack { #else NSMutableArray *devices = [NSMutableArray array]; NSMutableArray *deviceTypes = [NSMutableArray array]; - [deviceTypes addObjectsFromArray:@[ AVCaptureDeviceTypeBuiltInWideAngleCamera, AVCaptureDeviceTypeBuiltInUltraWideCamera, AVCaptureDeviceTypeBuiltInTelephotoCamera, AVCaptureDeviceTypeBuiltInDualCamera, AVCaptureDeviceTypeBuiltInDualWideCamera, AVCaptureDeviceTypeBuiltInTripleCamera]]; + [deviceTypes addObjectsFromArray:@[ + AVCaptureDeviceTypeBuiltInWideAngleCamera, + AVCaptureDeviceTypeBuiltInUltraWideCamera, + AVCaptureDeviceTypeBuiltInTelephotoCamera, + AVCaptureDeviceTypeBuiltInDualCamera, + AVCaptureDeviceTypeBuiltInDualWideCamera, + AVCaptureDeviceTypeBuiltInTripleCamera + ]]; if (@available(macos 14.0, ios 17.0, tvos 17.0, *)) { [deviceTypes addObject:AVCaptureDeviceTypeExternal]; } @@ -388,7 +405,7 @@ - (RTCVideoTrack *)createScreenCaptureVideoTrack { if (device.localizedName != nil) { label = device.localizedName; } - + [devices addObject:@{ @"facing" : position, @"deviceId" : device.uniqueID, @@ -427,18 +444,19 @@ - (void)addLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack { if (!videoTrack) { return; } - + // Create a dimension detector for this local track - VideoDimensionDetector *detector = [[VideoDimensionDetector alloc] initWith:@(-1) // -1 for local tracks + VideoDimensionDetector *detector = [[VideoDimensionDetector alloc] initWith:@(-1) // -1 for local tracks trackId:videoTrack.trackId webRTCModule:self]; - + // Store the detector using associated objects on the track itself - 
objc_setAssociatedObject(videoTrack, @selector(addLocalVideoTrackDimensionDetection:), detector, OBJC_ASSOCIATION_RETAIN_NONATOMIC); - + objc_setAssociatedObject( + videoTrack, @selector(addLocalVideoTrackDimensionDetection:), detector, OBJC_ASSOCIATION_RETAIN_NONATOMIC); + // Add the detector as a renderer to the track [videoTrack addRenderer:detector]; - + RCTLogTrace(@"[VideoTrackAdapter] Local dimension detector created for track %@", videoTrack.trackId); } @@ -446,14 +464,16 @@ - (void)removeLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack { if (!videoTrack) { return; } - + // Get the associated detector - VideoDimensionDetector *detector = objc_getAssociatedObject(videoTrack, @selector(addLocalVideoTrackDimensionDetection:)); - + VideoDimensionDetector *detector = + objc_getAssociatedObject(videoTrack, @selector(addLocalVideoTrackDimensionDetection:)); + if (detector) { [videoTrack removeRenderer:detector]; [detector dispose]; - objc_setAssociatedObject(videoTrack, @selector(addLocalVideoTrackDimensionDetection:), nil, OBJC_ASSOCIATION_RETAIN_NONATOMIC); + objc_setAssociatedObject( + videoTrack, @selector(addLocalVideoTrackDimensionDetection:), nil, OBJC_ASSOCIATION_RETAIN_NONATOMIC); RCTLogTrace(@"[VideoTrackAdapter] Local dimension detector removed for track %@", videoTrack.trackId); } } @@ -546,10 +566,11 @@ - (void)removeLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack { RTCAudioTrack *audioTrack = [self.peerConnectionFactory audioTrackWithTrackId:trackUUID]; audioTrack.isEnabled = originalTrack.isEnabled; [self.localTracks setObject:audioTrack forKey:trackUUID]; - for (NSString* streamId in [self.localStreams allKeys]) { - RTCMediaStream* stream = [self.localStreams objectForKey:streamId]; - if (stream == nil) continue; - for (RTCAudioTrack* track in [stream.audioTracks copy]) { + for (NSString *streamId in [self.localStreams allKeys]) { + RTCMediaStream *stream = [self.localStreams objectForKey:streamId]; + if (stream == nil) + 
continue; + for (RTCAudioTrack *track in [stream.audioTracks copy]) { if ([trackID isEqualToString:track.trackId]) { [stream addAudioTrack:audioTrack]; } @@ -560,15 +581,16 @@ - (void)removeLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack { RTCVideoSource *videoSource = originalVideoTrack.source; RTCVideoTrack *videoTrack = [self.peerConnectionFactory videoTrackWithSource:videoSource trackId:trackUUID]; videoTrack.isEnabled = originalTrack.isEnabled; - + // Add dimension detection for cloned local video tracks [self addLocalVideoTrackDimensionDetection:videoTrack]; - + [self.localTracks setObject:videoTrack forKey:trackUUID]; - for (NSString* streamId in [self.localStreams allKeys]) { - RTCMediaStream* stream = [self.localStreams objectForKey:streamId]; - if (stream == nil) continue; - for (RTCVideoTrack* track in [stream.videoTracks copy]) { + for (NSString *streamId in [self.localStreams allKeys]) { + RTCMediaStream *stream = [self.localStreams objectForKey:streamId]; + if (stream == nil) + continue; + for (RTCVideoTrack *track in [stream.videoTracks copy]) { if ([trackID isEqualToString:track.trackId]) { [stream addVideoTrack:videoTrack]; } @@ -600,7 +622,11 @@ - (void)removeLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack { #endif } -RCT_EXPORT_METHOD(mediaStreamTrackApplyConstraints : (nonnull NSString *)trackID : (NSDictionary *)constraints : (RCTPromiseResolveBlock)resolve : (RCTPromiseRejectBlock)reject) { +RCT_EXPORT_METHOD(mediaStreamTrackApplyConstraints + : (nonnull NSString *)trackID + : (NSDictionary *)constraints + : (RCTPromiseResolveBlock)resolve + : (RCTPromiseRejectBlock)reject) { #if TARGET_OS_TV reject(@"unsupported_platform", @"tvOS is not supported", nil); return; @@ -611,7 +637,7 @@ - (void)removeLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack { RTCVideoTrack *videoTrack = (RTCVideoTrack *)track; if ([videoTrack.captureController isKindOfClass:[CaptureController class]]) { CaptureController *vcc = 
(CaptureController *)videoTrack.captureController; - NSError* error = nil; + NSError *error = nil; [vcc applyConstraints:constraints error:&error]; if (error) { reject(@"E_INVALID", error.localizedDescription, error); @@ -638,29 +664,30 @@ - (void)removeLocalVideoTrackDimensionDetection:(RTCVideoTrack *)videoTrack { } } -RCT_EXPORT_METHOD(mediaStreamTrackSetVideoEffects:(nonnull NSString *)trackID names:(nonnull NSArray *)names) -{ - RTCMediaStreamTrack *track = self.localTracks[trackID]; - if (track) { - RTCVideoTrack *videoTrack = (RTCVideoTrack *)track; - RTCVideoSource *videoSource = videoTrack.source; - - NSMutableArray *processors = [[NSMutableArray alloc] init]; - for (NSString *name in names) { - NSObject *processor = [ProcessorProvider getProcessor:name]; - if (processor != nil) { - [processors addObject:processor]; - } - } - - self.videoEffectProcessor = [[VideoEffectProcessor alloc] initWithProcessors:processors - videoSource:videoSource]; - - VideoCaptureController *vcc = (VideoCaptureController *)videoTrack.captureController; - RTCVideoCapturer *capturer = vcc.capturer; - - capturer.delegate = self.videoEffectProcessor; - } +RCT_EXPORT_METHOD(mediaStreamTrackSetVideoEffects + : (nonnull NSString *)trackID names + : (nonnull NSArray *)names) { + RTCMediaStreamTrack *track = self.localTracks[trackID]; + if (track) { + RTCVideoTrack *videoTrack = (RTCVideoTrack *)track; + RTCVideoSource *videoSource = videoTrack.source; + + NSMutableArray *processors = [[NSMutableArray alloc] init]; + for (NSString *name in names) { + NSObject *processor = [ProcessorProvider getProcessor:name]; + if (processor != nil) { + [processors addObject:processor]; + } + } + + self.videoEffectProcessor = [[VideoEffectProcessor alloc] initWithProcessors:processors + videoSource:videoSource]; + + VideoCaptureController *vcc = (VideoCaptureController *)videoTrack.captureController; + RTCVideoCapturer *capturer = vcc.capturer; + + capturer.delegate = self.videoEffectProcessor; + } } 
#pragma mark - Helpers @@ -679,30 +706,29 @@ - (RTCMediaStreamTrack *)trackForId:(nonnull NSString *)trackId pcId:(nonnull NS } - (void)ensureAudioSessionWithRecording { - RTCAudioSession* session = [RTCAudioSession sharedInstance]; - - // we also need to set default WebRTC audio configuration, since it may be activated after - // this method is called - RTCAudioSessionConfiguration* config = [RTCAudioSessionConfiguration webRTCConfiguration]; - // require audio session to be either PlayAndRecord or MultiRoute - if (session.category != AVAudioSessionCategoryPlayAndRecord) { - [session lockForConfiguration]; - config.category = AVAudioSessionCategoryPlayAndRecord; - config.categoryOptions = - AVAudioSessionCategoryOptionAllowBluetooth| - AVAudioSessionCategoryOptionDefaultToSpeaker; - config.mode = AVAudioSessionModeVideoChat; - NSError* error = nil; - bool success = [session setCategory:config.category withOptions:config.categoryOptions error:&error]; - if (!success) { - NSLog(@"ensureAudioSessionWithRecording: setCategory failed due to: %@", [error localizedDescription]); - } - success = [session setMode:config.mode error:&error]; - if (!success) { - NSLog(@"ensureAudioSessionWithRecording: Error setting category: %@", [error localizedDescription]); - } - [session unlockForConfiguration]; - } + RTCAudioSession *session = [RTCAudioSession sharedInstance]; + + // we also need to set default WebRTC audio configuration, since it may be activated after + // this method is called + RTCAudioSessionConfiguration *config = [RTCAudioSessionConfiguration webRTCConfiguration]; + // require audio session to be either PlayAndRecord or MultiRoute + if (session.category != AVAudioSessionCategoryPlayAndRecord) { + [session lockForConfiguration]; + config.category = AVAudioSessionCategoryPlayAndRecord; + config.categoryOptions = + AVAudioSessionCategoryOptionAllowBluetooth | AVAudioSessionCategoryOptionDefaultToSpeaker; + config.mode = AVAudioSessionModeVideoChat; + NSError *error 
= nil; + bool success = [session setCategory:config.category withOptions:config.categoryOptions error:&error]; + if (!success) { + NSLog(@"ensureAudioSessionWithRecording: setCategory failed due to: %@", [error localizedDescription]); + } + success = [session setMode:config.mode error:&error]; + if (!success) { + NSLog(@"ensureAudioSessionWithRecording: setMode failed due to: %@", [error localizedDescription]); + } + [session unlockForConfiguration]; + } } @end diff --git a/ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.h b/ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.h index aa4c7aa79..0ad9ec0ad 100644 --- a/ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.h +++ b/ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.h @@ -14,4 +14,13 @@ @interface WebRTCModule (RTCPeerConnection) ++ (RTCCertificate *)getCertificate:(NSString *)certId; + +- (nullable RTCRtpSender *)getSenderByPeerConnectionId:(nonnull NSNumber *)peerConnectionId + senderId:(nonnull NSString *)senderId; +- (nullable RTCRtpReceiver *)getReceiverByPeerConnectionId:(nonnull NSNumber *)peerConnectionId + receiverId:(nonnull NSString *)receiverId; +- (nullable RTCRtpTransceiver *)getTransceiverByPeerConnectionId:(nonnull NSNumber *)peerConnectionId + transceiverId:(nonnull NSString *)transceiverId; + @end diff --git a/ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.m b/ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.m index 379ab295f..38fd42279 100644 --- a/ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.m +++ b/ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.m @@ -1,10 +1,12 @@ #import +#import #import #import #import #import +#import #import #import #import @@ -20,6 +22,7 @@ #import "WebRTCModule+RTCPeerConnection.h" #import "WebRTCModule+VideoTrackAdapter.h" #import "WebRTCModule.h" +#import "WebRTCModuleOptions.h" @implementation RTCPeerConnection (React) @@ -65,10 +68,77 @@ - (void)setWebRTCModule:(id)webRTCModule { @end +static NSMutableDictionary *gCertificates = nil; + @implementation WebRTCModule (RTCPeerConnection) ++ 
(void)initialize { + if (self == [WebRTCModule class]) { + gCertificates = [NSMutableDictionary new]; + } +} + ++ (RTCCertificate *)getCertificate:(NSString *)certId { + if (!gCertificates) { + return nil; + } + return gCertificates[certId]; +} + int _transceiverNextId = 0; +- (nullable RTCRtpSender *)getSenderByPeerConnectionId:(nonnull NSNumber *)peerConnectionId + senderId:(nonnull NSString *)senderId { + RTCPeerConnection *peerConnection = self.peerConnections[peerConnectionId]; + if (!peerConnection) { + RCTLogWarn(@"PeerConnection %@ not found", peerConnectionId); + return nil; + } + RTCRtpSender *sender = nil; + for (RTCRtpSender *s in peerConnection.senders) { + if ([senderId isEqual:s.senderId]) { + sender = s; + break; + } + } + + return sender; +} +- (nullable RTCRtpReceiver *)getReceiverByPeerConnectionId:(nonnull NSNumber *)peerConnectionId + receiverId:(nonnull NSString *)receiverId { + RTCPeerConnection *peerConnection = self.peerConnections[peerConnectionId]; + if (!peerConnection) { + RCTLogWarn(@"PeerConnection %@ not found", peerConnectionId); + return nil; + } + RTCRtpReceiver *receiver = nil; + for (RTCRtpReceiver *r in peerConnection.receivers) { + if ([receiverId isEqual:r.receiverId]) { + receiver = r; + break; + } + } + + return receiver; +} + +- (nullable RTCRtpTransceiver *)getTransceiverByPeerConnectionId:(nonnull NSNumber *)peerConnectionId + transceiverId:(nonnull NSString *)transceiverId { + RTCPeerConnection *peerConnection = self.peerConnections[peerConnectionId]; + if (!peerConnection) { + RCTLogWarn(@"PeerConnection %@ not found", peerConnectionId); + return nil; + } + RTCRtpTransceiver *transceiver = nil; + for (RTCRtpTransceiver *t in peerConnection.transceivers) { + if ([transceiverId isEqual:t.sender.senderId]) { + transceiver = t; + break; + } + } + + return transceiver; +} /* * This method is synchronous and blocking. 
This is done so we can implement createDataChannel * in the same way (synchronous) since the peer connection needs to exist before. @@ -465,13 +535,13 @@ @implementation WebRTCModule (RTCPeerConnection) NSArray *streamIds = [options objectForKey:@"streamIds"]; RTCRtpSender *sender = [peerConnection addTrack:track streamIds:streamIds]; - + // Add mute detection for local video tracks (dimension detection handled at track creation) if (track.kind == kRTCMediaStreamTrackKindVideo) { RTCVideoTrack *videoTrack = (RTCVideoTrack *)track; [peerConnection addVideoTrackAdapter:videoTrack]; } - + RTCRtpTransceiver *transceiver = nil; for (RTCRtpTransceiver *t in peerConnection.transceivers) { @@ -535,7 +605,7 @@ @implementation WebRTCModule (RTCPeerConnection) RTCRtpTransceiverInit *transceiverInit = [SerializeUtils parseTransceiverOptions:initOptions]; transceiver = [peerConnection addTransceiverWithTrack:track init:transceiverInit]; - + // Add mute detection for local video tracks (dimension detection handled at track creation) if (track && track.kind == kRTCMediaStreamTrackKindVideo && self.localTracks[trackId]) { RTCVideoTrack *videoTrack = (RTCVideoTrack *)track; @@ -909,6 +979,10 @@ - (void)peerConnection:(RTC_OBJC_TYPE(RTCPeerConnection) *)peerConnection RTCVideoTrack *videoTrack = (RTCVideoTrack *)track; [peerConnection addVideoTrackAdapter:videoTrack]; [peerConnection addVideoDimensionDetector:videoTrack]; + } else if (track.kind == kRTCMediaStreamTrackKindAudio) { + RTCAudioTrack *audioTrack = (RTCAudioTrack *)track; + WebRTCModuleOptions *options = [WebRTCModuleOptions sharedInstance]; + audioTrack.source.volume = options.defaultTrackVolume; } peerConnection.remoteTracks[track.trackId] = track; @@ -976,4 +1050,84 @@ - (void)peerConnection:(nonnull RTCPeerConnection *)peerConnection didRemoveStre // Unused in Unified Plan. 
} +RCT_EXPORT_METHOD(generateCertificate + : (NSDictionary *)options resolver + : (RCTPromiseResolveBlock)resolve rejecter + : (RCTPromiseRejectBlock)reject) { + NSString *keyType = @"ECDSA"; + if (options[@"keyType"]) { + NSString *type = options[@"keyType"]; + if ([type isEqualToString:@"RSA"] || [type isEqualToString:@"ECDSA"]) { + keyType = type; + } + } + + NSMutableDictionary *params = [NSMutableDictionary new]; + if ([keyType isEqualToString:@"RSA"]) { + params[@"name"] = @"RSASSA-PKCS1-v1_5"; + } else { + params[@"name"] = @"ECDSA"; + params[@"namedCurve"] = @"P-256"; + } + + if (options[@"expires"]) { + params[@"expires"] = options[@"expires"]; + } else { + params[@"expires"] = @(2592000); // 30 days + } + + RTCCertificate *cert = [RTCCertificate generateCertificateWithParams:params]; + + if (!cert) { + reject(@"E_GEN_CERT_FAILED", @"Failed to generate certificate", nil); + return; + } + + NSString *certId = [[NSUUID UUID] UUIDString]; + gCertificates[certId] = cert; + + NSMutableDictionary *result = [NSMutableDictionary new]; + result[@"certificateId"] = certId; + + // expires + NSDate *now = [NSDate date]; + NSTimeInterval expiresSeconds = [params[@"expires"] doubleValue]; + result[@"expires"] = @((long)(([now timeIntervalSince1970] + expiresSeconds) * 1000)); + + // Fingerprints + NSMutableArray *fingerprints = [NSMutableArray new]; + NSString *pem = cert.certificate; + + // Calculate SHA-256 fingerprint + NSRange start = [pem rangeOfString:@"-----BEGIN CERTIFICATE-----"]; + NSRange end = [pem rangeOfString:@"-----END CERTIFICATE-----"]; + if (start.location != NSNotFound && end.location != NSNotFound) { + NSUInteger actualStart = start.location + start.length; + NSString *base64 = [pem substringWithRange:NSMakeRange(actualStart, end.location - actualStart)]; + // Remove newlines + base64 = [[base64 componentsSeparatedByCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]] + componentsJoinedByString:@""]; + + NSData *der = [[NSData alloc] 
initWithBase64EncodedString:base64 options:0]; + if (der) { + unsigned char digest[CC_SHA256_DIGEST_LENGTH]; + CC_SHA256(der.bytes, (CC_LONG)der.length, digest); + + NSMutableString *fingerprint = [NSMutableString stringWithCapacity:CC_SHA256_DIGEST_LENGTH * 3]; + for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; ++i) { + [fingerprint appendFormat:@"%02x:", digest[i]]; + } + if (fingerprint.length > 0) { + [fingerprint deleteCharactersInRange:NSMakeRange(fingerprint.length - 1, 1)]; + } + + [fingerprints addObject:@{@"algorithm" : @"sha-256", @"value" : fingerprint}]; + } + } + + result[@"fingerprints"] = fingerprints; + + resolve(result); +} + @end diff --git a/ios/RCTWebRTC/WebRTCModule+Transceivers.m b/ios/RCTWebRTC/WebRTCModule+Transceivers.m index bd93afca0..3d969859f 100644 --- a/ios/RCTWebRTC/WebRTCModule+Transceivers.m +++ b/ios/RCTWebRTC/WebRTCModule+Transceivers.m @@ -219,7 +219,7 @@ @implementation WebRTCModule (Transceivers) }]; } } - + // Convert JSON codec capabilities to the actual objects. // Codec preferences is order sensitive. 
NSMutableArray *codecsToSet = [NSMutableArray new]; @@ -254,6 +254,7 @@ - (RTCRtpParameters *)updateParametersWithOptions:(NSDictionary *)options params encoding.isActive = [encodingUpdate[@"active"] boolValue]; encoding.rid = encodingUpdate[@"rid"]; encoding.maxBitrateBps = encodingUpdate[@"maxBitrate"]; + encoding.minBitrateBps = encodingUpdate[@"minBitrate"]; encoding.maxFramerate = encodingUpdate[@"maxFramerate"]; encoding.scaleResolutionDownBy = encodingUpdate[@"scaleResolutionDownBy"]; } diff --git a/ios/RCTWebRTC/WebRTCModule+VideoTrackAdapter.m b/ios/RCTWebRTC/WebRTCModule+VideoTrackAdapter.m index 49ae665aa..ccb8c31bf 100644 --- a/ios/RCTWebRTC/WebRTCModule+VideoTrackAdapter.m +++ b/ios/RCTWebRTC/WebRTCModule+VideoTrackAdapter.m @@ -248,8 +248,8 @@ - (void)addVideoDimensionDetector:(RTCVideoTrack *)track { } VideoDimensionDetector *dimensionDetector = [[VideoDimensionDetector alloc] initWith:self.reactTag - trackId:trackId - webRTCModule:self.webRTCModule]; + trackId:trackId + webRTCModule:self.webRTCModule]; [self.videoDimensionDetectors setObject:dimensionDetector forKey:trackId]; [track addRenderer:dimensionDetector]; diff --git a/ios/RCTWebRTC/WebRTCModule.h b/ios/RCTWebRTC/WebRTCModule.h index ace2ba043..3c631e355 100644 --- a/ios/RCTWebRTC/WebRTCModule.h +++ b/ios/RCTWebRTC/WebRTCModule.h @@ -31,7 +31,8 @@ static NSString *const kEventAudioDeviceModuleEngineDidStop = @"audioDeviceModul static NSString *const kEventAudioDeviceModuleEngineDidDisable = @"audioDeviceModuleEngineDidDisable"; static NSString *const kEventAudioDeviceModuleEngineWillRelease = @"audioDeviceModuleEngineWillRelease"; static NSString *const kEventAudioDeviceModuleDevicesUpdated = @"audioDeviceModuleDevicesUpdated"; -static NSString *const kEventAudioDeviceModuleAudioProcessingStateUpdated = @"audioDeviceModuleAudioProcessingStateUpdated"; +static NSString *const kEventAudioDeviceModuleAudioProcessingStateUpdated = + @"audioDeviceModuleAudioProcessingStateUpdated"; @class 
AudioDeviceModule; @@ -48,8 +49,6 @@ static NSString *const kEventAudioDeviceModuleAudioProcessingStateUpdated = @"au @property(nonatomic, strong) NSMutableDictionary *localStreams; @property(nonatomic, strong) NSMutableDictionary *localTracks; -// TODO: FrameCryption is not supported by this SDK yet. These containers are -// retained so the native factory initialization keeps working unchanged. @property(nonatomic, strong) NSMutableDictionary *frameCryptors; @property(nonatomic, strong) NSMutableDictionary *keyProviders; @property(nonatomic, strong) NSMutableDictionary *dataPacketCryptors; diff --git a/ios/RCTWebRTC/WebRTCModule.m b/ios/RCTWebRTC/WebRTCModule.m index 5188a7496..26944dd72 100644 --- a/ios/RCTWebRTC/WebRTCModule.m +++ b/ios/RCTWebRTC/WebRTCModule.m @@ -74,7 +74,9 @@ - (instancetype)init { if (encoderFactory == nil) { RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init]; - RTCVideoEncoderFactorySimulcast *simulcastVideoEncoderFactory = [[RTCVideoEncoderFactorySimulcast alloc] initWithPrimary:videoEncoderFactory fallback:videoEncoderFactory]; + RTCVideoEncoderFactorySimulcast *simulcastVideoEncoderFactory = + [[RTCVideoEncoderFactorySimulcast alloc] initWithPrimary:videoEncoderFactory + fallback:videoEncoderFactory]; encoderFactory = simulcastVideoEncoderFactory; } if (decoderFactory == nil) { @@ -89,17 +91,17 @@ - (instancetype)init { // Always ensure an audio processing module exists so screen share // audio mixing can use capturePostProcessingDelegate at runtime. 
if (audioProcessingModule == nil && audioDevice == nil) { - audioProcessingModule = [[RTCDefaultAudioProcessingModule alloc] - initWithConfig:nil - capturePostProcessingDelegate:nil - renderPreProcessingDelegate:nil]; + audioProcessingModule = [[RTCDefaultAudioProcessingModule alloc] initWithConfig:nil + capturePostProcessingDelegate:nil + renderPreProcessingDelegate:nil]; options.audioProcessingModule = audioProcessingModule; RCTLogInfo(@"Created default audio processing module for screen share audio mixing"); } if (audioProcessingModule != nil) { if (audioDevice != nil) { - NSLog(@"Both audioProcessingModule and audioDevice are provided, but only one can be used. Ignoring audioDevice."); + NSLog(@"Both audioProcessingModule and audioDevice are provided, but only one can be used. Ignoring " + @"audioDevice."); } RCTLogInfo(@"Using audio processing module: %@", NSStringFromClass([audioProcessingModule class])); @@ -131,7 +133,6 @@ - (instancetype)init { _localStreams = [NSMutableDictionary new]; _localTracks = [NSMutableDictionary new]; - // TODO: FrameCryption is not supported yet; dictionaries left empty. _frameCryptors = [NSMutableDictionary new]; _keyProviders = [NSMutableDictionary new]; _dataPacketCryptors = [NSMutableDictionary new]; diff --git a/ios/RCTWebRTC/WebRTCModuleOptions.h b/ios/RCTWebRTC/WebRTCModuleOptions.h index b363cc4ff..c5f599f79 100644 --- a/ios/RCTWebRTC/WebRTCModuleOptions.h +++ b/ios/RCTWebRTC/WebRTCModuleOptions.h @@ -26,6 +26,8 @@ NS_ASSUME_NONNULL_BEGIN /// `createScreenCaptureVideoTrack` when in-app mode is used. 
@property(nonatomic, weak, nullable) InAppScreenCapturer *activeInAppScreenCapturer; +@property(nonatomic, assign) double defaultTrackVolume; + #pragma mark - This class is a singleton + (instancetype _Nonnull)sharedInstance; diff --git a/ios/RCTWebRTC/WebRTCModuleOptions.m b/ios/RCTWebRTC/WebRTCModuleOptions.m index ba108da6e..7dcb85d41 100644 --- a/ios/RCTWebRTC/WebRTCModuleOptions.m +++ b/ios/RCTWebRTC/WebRTCModuleOptions.m @@ -22,6 +22,7 @@ - (instancetype)init { self.videoEncoderFactory = nil; self.videoDecoderFactory = nil; self.loggingSeverity = RTCLoggingSeverityNone; + self.defaultTrackVolume = 1.0; } return self; diff --git a/ios/RCTWebRTC/videoEffects/ProcessorProvider.h b/ios/RCTWebRTC/videoEffects/ProcessorProvider.h index 741c0c75b..144ba38fb 100644 --- a/ios/RCTWebRTC/videoEffects/ProcessorProvider.h +++ b/ios/RCTWebRTC/videoEffects/ProcessorProvider.h @@ -3,8 +3,7 @@ @interface ProcessorProvider : NSObject + (NSObject *)getProcessor:(NSString *)name; -+ (void)addProcessor:(NSObject *)processor - forName:(NSString *)name; ++ (void)addProcessor:(NSObject *)processor forName:(NSString *)name; + (void)removeProcessor:(NSString *)name; @end diff --git a/ios/RCTWebRTC/videoEffects/ProcessorProvider.m b/ios/RCTWebRTC/videoEffects/ProcessorProvider.m index 78f441420..cd00ecdd2 100644 --- a/ios/RCTWebRTC/videoEffects/ProcessorProvider.m +++ b/ios/RCTWebRTC/videoEffects/ProcessorProvider.m @@ -5,20 +5,19 @@ @implementation ProcessorProvider static NSMutableDictionary *> *processorMap; + (void)initialize { - processorMap = [[NSMutableDictionary alloc] init]; + processorMap = [[NSMutableDictionary alloc] init]; } + (NSObject *)getProcessor:(NSString *)name { - return [processorMap objectForKey:name]; + return [processorMap objectForKey:name]; } -+ (void)addProcessor:(NSObject *)processor - forName:(NSString *)name { - [processorMap setObject:processor forKey:name]; ++ (void)addProcessor:(NSObject *)processor forName:(NSString *)name { + [processorMap 
setObject:processor forKey:name]; } + (void)removeProcessor:(NSString *)name { - [processorMap removeObjectForKey:name]; + [processorMap removeObjectForKey:name]; } @end diff --git a/ios/RCTWebRTC/videoEffects/VideoEffectProcessor.h b/ios/RCTWebRTC/videoEffects/VideoEffectProcessor.h index b46ea34c8..81aead38d 100644 --- a/ios/RCTWebRTC/videoEffects/VideoEffectProcessor.h +++ b/ios/RCTWebRTC/videoEffects/VideoEffectProcessor.h @@ -4,8 +4,8 @@ @interface VideoEffectProcessor : NSObject -@property (nonatomic, strong) NSArray *> *videoFrameProcessors; -@property (nonatomic, strong) RTCVideoSource *videoSource; +@property(nonatomic, strong) NSArray *> *videoFrameProcessors; +@property(nonatomic, strong) RTCVideoSource *videoSource; - (instancetype)initWithProcessors:(NSArray *> *)videoFrameProcessors videoSource:(RTCVideoSource *)videoSource; diff --git a/ios/RCTWebRTC/videoEffects/VideoEffectProcessor.m b/ios/RCTWebRTC/videoEffects/VideoEffectProcessor.m index a5da8d990..186e0ab59 100644 --- a/ios/RCTWebRTC/videoEffects/VideoEffectProcessor.m +++ b/ios/RCTWebRTC/videoEffects/VideoEffectProcessor.m @@ -6,18 +6,18 @@ @implementation VideoEffectProcessor - (instancetype)initWithProcessors:(NSArray *> *)videoFrameProcessors videoSource:(RTCVideoSource *)videoSource { - self = [super init]; - _videoFrameProcessors = videoFrameProcessors; - _videoSource = videoSource; - return self; + self = [super init]; + _videoFrameProcessors = videoFrameProcessors; + _videoSource = videoSource; + return self; } - (void)capturer:(nonnull RTCVideoCapturer *)capturer didCaptureVideoFrame:(nonnull RTCVideoFrame *)frame { - RTCVideoFrame *processedFrame = frame; - for (NSObject *processor in _videoFrameProcessors) { - processedFrame = [processor capturer:capturer didCaptureVideoFrame:processedFrame]; - } - [self.videoSource capturer:capturer didCaptureVideoFrame:processedFrame]; + RTCVideoFrame *processedFrame = frame; + for (NSObject *processor in _videoFrameProcessors) { + processedFrame = 
[processor capturer:capturer didCaptureVideoFrame:processedFrame]; + } + [self.videoSource capturer:capturer didCaptureVideoFrame:processedFrame]; } @end diff --git a/ios/RCTWebRTC/videoEffects/VideoFrameProcessor.h b/ios/RCTWebRTC/videoEffects/VideoFrameProcessor.h index f504b475a..b51b3f623 100644 --- a/ios/RCTWebRTC/videoEffects/VideoFrameProcessor.h +++ b/ios/RCTWebRTC/videoEffects/VideoFrameProcessor.h @@ -3,7 +3,6 @@ @protocol VideoFrameProcessorDelegate -- (RTCVideoFrame *)capturer:(RTCVideoCapturer *)capturer - didCaptureVideoFrame:(RTCVideoFrame *)frame; +- (RTCVideoFrame *)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame; @end diff --git a/package-lock.json b/package-lock.json index 8ef8e5ddd..810b62a32 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,17 +1,16 @@ { "name": "@stream-io/react-native-webrtc", - "version": "137.1.4-alpha.8", + "version": "137.1.4-alpha.10", "lockfileVersion": 2, "requires": true, "packages": { "": { "name": "@stream-io/react-native-webrtc", - "version": "137.1.4-alpha.8", + "version": "137.1.4-alpha.10", "license": "MIT", "dependencies": { "base64-js": "1.5.1", - "debug": "4.3.4", - "event-target-shim": "6.0.2" + "debug": "4.3.4" }, "devDependencies": { "@types/debug": "4.1.7", @@ -4477,17 +4476,6 @@ "node": ">= 0.6" } }, - "node_modules/event-target-shim": { - "version": "6.0.2", - "resolved": "https://registry.npmjs.org/event-target-shim/-/event-target-shim-6.0.2.tgz", - "integrity": "sha512-8q3LsZjRezbFZ2PN+uP+Q7pnHUMmAOziU2vA2OwoFaKIXxlxl38IylhSSgUorWu/rf4er67w0ikBqjBFk/pomA==", - "engines": { - "node": ">=10.13.0" - }, - "funding": { - "url": "https://github.com/sponsors/mysticatea" - } - }, "node_modules/execa": { "version": "5.1.1", "resolved": "https://registry.npmjs.org/execa/-/execa-5.1.1.tgz", @@ -11801,11 +11789,6 @@ "integrity": "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==", "dev": true }, - "event-target-shim": { - 
"version": "6.0.2", - "resolved": "https://registry.npmjs.org/event-target-shim/-/event-target-shim-6.0.2.tgz", - "integrity": "sha512-8q3LsZjRezbFZ2PN+uP+Q7pnHUMmAOziU2vA2OwoFaKIXxlxl38IylhSSgUorWu/rf4er67w0ikBqjBFk/pomA==" - }, "execa": { "version": "5.1.1", "resolved": "https://registry.npmjs.org/execa/-/execa-5.1.1.tgz", diff --git a/package.json b/package.json index 84608304c..3b8b27240 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "@stream-io/react-native-webrtc", - "version": "137.1.4-alpha.8", + "version": "137.1.4-alpha.10", "repository": { "type": "git", "url": "git+https://github.com/GetStream/react-native-webrtc.git" @@ -21,8 +21,7 @@ "module": "lib/module/index.js", "dependencies": { "base64-js": "1.5.1", - "debug": "4.3.4", - "event-target-shim": "6.0.2" + "debug": "4.3.4" }, "peerDependencies": { "react-native": ">=0.73.0" diff --git a/src/EventEmitter.ts b/src/EventEmitter.ts index b21e49d47..939587d82 100644 --- a/src/EventEmitter.ts +++ b/src/EventEmitter.ts @@ -24,6 +24,7 @@ const NATIVE_EVENTS = [ 'mediaStreamTrackMuteChanged', 'videoTrackDimensionChanged', 'mediaStreamTrackEnded', + 'frameCryptionStateChanged', ]; const eventEmitter = new EventEmitter(); diff --git a/src/MediaDevices.ts b/src/MediaDevices.ts index cdb5c9542..9359ac4de 100644 --- a/src/MediaDevices.ts +++ b/src/MediaDevices.ts @@ -1,9 +1,9 @@ -import { EventTarget, Event, defineEventAttribute } from 'event-target-shim/index'; import { NativeModules } from 'react-native'; import { addListener } from './EventEmitter'; -import getDisplayMedia from './getDisplayMedia'; -import getUserMedia, { Constraints } from './getUserMedia'; +import getDisplayMedia, { Constraints as DisplayMediaConstraints } from './getDisplayMedia'; +import getUserMedia, { Constraints as UserMediaConstraints } from './getUserMedia'; +import { Event, EventTarget, getEventAttributeValue, setEventAttributeValue } from './vendor/event-target-shim'; const { WebRTCModule } = NativeModules; @@ 
-40,6 +40,14 @@ type MediaDevicesEventMap = { } class MediaDevices extends EventTarget { + get ondevicechange() { + return getEventAttributeValue(this, 'devicechange'); + } + + set ondevicechange(value) { + setEventAttributeValue(this, 'devicechange', value); + } + /** * W3C "Media Capture and Streams" compatible {@code enumerateDevices} * implementation. @@ -52,12 +60,13 @@ class MediaDevices extends EventTarget { * W3C "Screen Capture" compatible {@code getDisplayMedia} implementation. * See: https://w3c.github.io/mediacapture-screen-share/ * + * @param {*} constraints * @returns {Promise} */ - getDisplayMedia() { + getDisplayMedia(constraints: DisplayMediaConstraints) { ensureListeners(); - return getDisplayMedia(); + return getDisplayMedia(constraints); } /** @@ -68,19 +77,11 @@ class MediaDevices extends EventTarget { * @param {*} constraints * @returns {Promise} */ - getUserMedia(constraints: Constraints) { + getUserMedia(constraints: UserMediaConstraints) { ensureListeners(); return getUserMedia(constraints); } } -/** - * Define the `onxxx` event handlers. 
- */ -const proto = MediaDevices.prototype; - -defineEventAttribute(proto, 'devicechange'); - - export default new MediaDevices(); diff --git a/src/MediaStream.ts b/src/MediaStream.ts index 1c2437e47..32872b79a 100644 --- a/src/MediaStream.ts +++ b/src/MediaStream.ts @@ -1,9 +1,9 @@ -import { EventTarget, defineEventAttribute } from 'event-target-shim/index'; import { NativeModules } from 'react-native'; import MediaStreamTrack, { MediaStreamTrackInfo } from './MediaStreamTrack'; import MediaStreamTrackEvent from './MediaStreamTrackEvent'; import { uniqueID } from './RTCUtil'; +import { EventTarget, getEventAttributeValue, setEventAttributeValue } from './vendor/event-target-shim'; const { WebRTCModule } = NativeModules; @@ -81,6 +81,22 @@ export default class MediaStream extends EventTarget { } } + get onaddtrack() { + return getEventAttributeValue(this, 'addtrack'); + } + + set onaddtrack(value) { + setEventAttributeValue(this, 'addtrack', value); + } + + get onremovetrack() { + return getEventAttributeValue(this, 'removetrack'); + } + + set onremovetrack(value) { + setEventAttributeValue(this, 'removetrack', value); + } + get id(): string { return this._id; } @@ -151,11 +167,3 @@ export default class MediaStream extends EventTarget { WebRTCModule.mediaStreamRelease(this._reactTag); } } - -/** - * Define the `onxxx` event handlers. 
- */ -const proto = MediaStream.prototype; - -defineEventAttribute(proto, 'addtrack'); -defineEventAttribute(proto, 'removetrack'); diff --git a/src/MediaStreamTrack.ts b/src/MediaStreamTrack.ts index 6f8cd8141..78cc17d12 100644 --- a/src/MediaStreamTrack.ts +++ b/src/MediaStreamTrack.ts @@ -1,4 +1,3 @@ -import { EventTarget, Event, defineEventAttribute } from 'event-target-shim/index'; import { NativeModules } from 'react-native'; import { MediaTrackConstraints } from './Constraints'; @@ -6,6 +5,7 @@ import { addListener, removeListener } from './EventEmitter'; import Logger from './Logger'; import { videoTrackDimensionChangedEventQueue } from './MediaDevices'; import { deepClone, normalizeConstraints } from './RTCUtil'; +import { Event, EventTarget, getEventAttributeValue, setEventAttributeValue } from './vendor/event-target-shim'; const log = new Logger('pc'); const { WebRTCModule } = NativeModules; @@ -74,6 +74,30 @@ export default class MediaStreamTrack extends EventTarget { if (this.kind !== 'video') { - throw new Error('Only implemented for video tracks'); + log.info(`Only implemented for video tracks, ignoring applyConstraints for ${this.id}`); + + return; } const normalized = normalizeConstraints({ video: constraints ?? 
true }); @@ -304,12 +330,3 @@ export default class MediaStreamTrack extends EventTarget { _peerConnectionId: number; _reactTag: string; - _bufferedAmount: number; _id: number; _label: string; @@ -55,6 +54,54 @@ export default class RTCDataChannel extends EventTarget { this._registerEvents(); } + get onbufferedamountlow() { + return getEventAttributeValue(this, 'bufferedamountlow'); + } + + set onbufferedamountlow(value) { + setEventAttributeValue(this, 'bufferedamountlow', value); + } + + get onclose() { + return getEventAttributeValue(this, 'close'); + } + + set onclose(value) { + setEventAttributeValue(this, 'close', value); + } + + get onclosing() { + return getEventAttributeValue(this, 'closing'); + } + + set onclosing(value) { + setEventAttributeValue(this, 'closing', value); + } + + get onerror() { + return getEventAttributeValue(this, 'error'); + } + + set onerror(value) { + setEventAttributeValue(this, 'error', value); + } + + get onmessage() { + return getEventAttributeValue(this, 'message'); + } + + set onmessage(value) { + setEventAttributeValue(this, 'message', value); + } + + get onopen() { + return getEventAttributeValue(this, 'open'); + } + + set onopen(value) { + setEventAttributeValue(this, 'open', value); + } + get bufferedAmount(): number { return this._bufferedAmount; } @@ -176,15 +223,3 @@ export default class RTCDataChannel extends EventTarget { }); } } - -/** - * Define the `onxxx` event handlers. 
- */ -const proto = RTCDataChannel.prototype; - -defineEventAttribute(proto, 'bufferedamountlow'); -defineEventAttribute(proto, 'close'); -defineEventAttribute(proto, 'closing'); -defineEventAttribute(proto, 'error'); -defineEventAttribute(proto, 'message'); -defineEventAttribute(proto, 'open'); diff --git a/src/RTCDataChannelEvent.ts b/src/RTCDataChannelEvent.ts index 956d059b6..7bfbe99a1 100644 --- a/src/RTCDataChannelEvent.ts +++ b/src/RTCDataChannelEvent.ts @@ -1,6 +1,6 @@ -import { Event } from 'event-target-shim/index'; - import type RTCDataChannel from './RTCDataChannel'; +import { Event } from './vendor/event-target-shim'; + type DATA_CHANNEL_EVENTS = 'open'| 'message'| 'bufferedamountlow'| 'closing'| 'close'| 'error' | 'datachannel'; diff --git a/src/RTCDataPacketCryptor.ts b/src/RTCDataPacketCryptor.ts new file mode 100644 index 000000000..c70b67f8d --- /dev/null +++ b/src/RTCDataPacketCryptor.ts @@ -0,0 +1,90 @@ +import * as base64 from 'base64-js'; +import { NativeModules } from 'react-native'; + +import Logger from './Logger'; +const { WebRTCModule } = NativeModules; +const log = new Logger('pc'); + +export interface RTCEncryptedPacket { + payload: Uint8Array, + iv: Uint8Array, + keyIndex: number, +} + +export default class RTCDataPacketCryptor { + _id: string; + + constructor(dataPacketCryptorId: string) { + this._id = dataPacketCryptorId; + } + + async encrypt(participantId: string, keyIndex: number, data: Uint8Array): Promise { + const params = { + dataPacketCryptorId: this._id, + participantId, + keyIndex, + data: base64.fromByteArray(data) + }; + + const result = await WebRTCModule.dataPacketCryptorEncrypt(params); + + if (!result) { + log.info('encrypt: result null'); + + return null; + } + + if (result.payload === undefined) { + log.info('encrypt: payload null'); + + return null; + } + + if (result.iv === undefined) { + log.info('encrypt: iv null'); + + return null; + } + + if (result.keyIndex === undefined) { + log.info('encrypt: keyIndex 
null');
+
+            return null;
+        }
+
+        return {
+            payload: base64.toByteArray(result.payload),
+            iv: base64.toByteArray(result.iv),
+            keyIndex: result.keyIndex
+        };
+    }
+
+    async decrypt(participantId: string, packet: RTCEncryptedPacket): Promise<Uint8Array | null> {
+        const params = {
+            dataPacketCryptorId: this._id,
+            participantId,
+            payload: base64.fromByteArray(packet.payload),
+            iv: base64.fromByteArray(packet.iv),
+            keyIndex: packet.keyIndex,
+        };
+
+        const result = await WebRTCModule.dataPacketCryptorDecrypt(params);
+
+        if (!result) {
+            log.info('decrypt: result null');
+
+            return null;
+        }
+
+        return base64.toByteArray(result.data);
+    }
+
+    async dispose() {
+        const params = {
+            dataPacketCryptorId: this._id,
+        };
+
+        await WebRTCModule.dataPacketCryptorDispose(params);
+    }
+}
diff --git a/src/RTCDataPacketCryptorFactory.ts b/src/RTCDataPacketCryptorFactory.ts
new file mode 100644
index 000000000..84798227a
--- /dev/null
+++ b/src/RTCDataPacketCryptorFactory.ts
@@ -0,0 +1,25 @@
+import { NativeModules } from 'react-native';
+
+import RTCDataPacketCryptor from './RTCDataPacketCryptor';
+import { RTCFrameCryptorAlgorithm } from './RTCFrameCryptorFactory';
+import RTCKeyProvider from './RTCKeyProvider';
+const { WebRTCModule } = NativeModules;
+
+export default class RTCDataPacketCryptorFactory {
+    static async createDataPacketCryptor(
+        algorithm: RTCFrameCryptorAlgorithm,
+        keyProvider: RTCKeyProvider
+    ): Promise<RTCDataPacketCryptor> {
+        const params = {
+            'algorithm': algorithm,
+            'keyProviderId': keyProvider._id
+        };
+        const result = await WebRTCModule.dataPacketCryptorFactoryCreateDataPacketCryptor(params);
+
+        if (!result) {
+            throw new Error('Error when creating data packet cryptor');
+        }
+
+        return new RTCDataPacketCryptor(result.dataPacketCryptorId);
+    }
+}
\ No newline at end of file
diff --git a/src/RTCErrorEvent.ts b/src/RTCErrorEvent.ts
index 6a0a28613..9240dec89 100644
--- a/src/RTCErrorEvent.ts
+++ b/src/RTCErrorEvent.ts
@@ -1,4 +1,4 @@
-import { Event } from
'event-target-shim/index';
+import { Event } from './vendor/event-target-shim';

 type RTCPeerConnectionErrorFunc =
     | 'addTransceiver'
diff --git a/src/RTCFrameCryptor.ts b/src/RTCFrameCryptor.ts
new file mode 100644
index 000000000..14c0a9e2e
--- /dev/null
+++ b/src/RTCFrameCryptor.ts
@@ -0,0 +1,166 @@
+import { NativeModules } from 'react-native';
+
+import { addListener, removeListener } from './EventEmitter';
+import Logger from './Logger';
+import { Event, EventTarget, getEventAttributeValue, setEventAttributeValue } from './vendor/event-target-shim';
+
+const { WebRTCModule } = NativeModules;
+
+const log = new Logger('pc');
+
+type FRAME_CRYPTOR_EVENTS = 'framecryptorstatechanged';
+
+interface IRTCFrameCryptorStateEventInitDict extends Event.EventInit {
+    frameCryptor: RTCFrameCryptor;
+    state: RTCFrameCryptorState;
+}
+
+/**
+ * @eventClass
+ * This event is fired whenever the state of the associated RTCFrameCryptor changes.
+ * @param {FRAME_CRYPTOR_EVENTS} type - The type of event.
+ * @param {IRTCFrameCryptorStateEventInitDict} eventInitDict - The event init properties.
+ */
+export class RTCFrameCryptorStateEvent<
+    TEventType extends FRAME_CRYPTOR_EVENTS
+> extends Event<TEventType> {
+    /** @eventProperty */
+    frameCryptor: RTCFrameCryptor;
+    /** @eventProperty */
+    state: RTCFrameCryptorState;
+    constructor(type: TEventType, eventInitDict: IRTCFrameCryptorStateEventInitDict) {
+        super(type, eventInitDict);
+        this.frameCryptor = eventInitDict.frameCryptor;
+        this.state = eventInitDict.state;
+    }
+}
+
+type RTCFrameCryptorEventMap = {
+    framecryptorstatechanged: RTCFrameCryptorStateEvent<'framecryptorstatechanged'>;
+}
+
+export enum RTCFrameCryptorState {
+    FrameCryptorStateNew,
+    FrameCryptorStateOk,
+    FrameCryptorStateEncryptionFailed,
+    FrameCryptorStateDecryptionFailed,
+    FrameCryptorStateMissingKey,
+    FrameCryptorStateKeyRatcheted,
+    FrameCryptorStateInternalError,
+}
+
+export default class RTCFrameCryptor extends EventTarget<RTCFrameCryptorEventMap> {
+    private _frameCryptorId: string;
+    private _participantId: string;
+
+    get onframecryptorstatechanged() {
+        return getEventAttributeValue(this, 'framecryptorstatechanged');
+    }
+
+    set onframecryptorstatechanged(value) {
+        setEventAttributeValue(this, 'framecryptorstatechanged', value);
+    }
+
+    constructor(frameCryptorId: string, participantId: string) {
+        super();
+        this._frameCryptorId = frameCryptorId;
+        this._participantId = participantId;
+        this._registerEvents();
+    }
+
+    get id() {
+        return this._frameCryptorId;
+    }
+
+    get participantId() {
+        return this._participantId;
+    }
+
+    _cryptorStateFromString(str: string): RTCFrameCryptorState {
+        switch (str) {
+            case 'new':
+                return RTCFrameCryptorState.FrameCryptorStateNew;
+            case 'ok':
+                return RTCFrameCryptorState.FrameCryptorStateOk;
+            case 'decryptionFailed':
+                return RTCFrameCryptorState.FrameCryptorStateDecryptionFailed;
+            case 'encryptionFailed':
+                return RTCFrameCryptorState.FrameCryptorStateEncryptionFailed;
+            case 'internalError':
+                return RTCFrameCryptorState.FrameCryptorStateInternalError;
+            case 'keyRatcheted':
+                return
RTCFrameCryptorState.FrameCryptorStateKeyRatcheted;
+            case 'missingKey':
+                return RTCFrameCryptorState.FrameCryptorStateMissingKey;
+            default:
+                throw new Error(`Unknown FrameCryptorState: ${str}`);
+        }
+    }
+
+    async setKeyIndex(keyIndex: number): Promise<boolean> {
+        const params = {
+            frameCryptorId: this._frameCryptorId,
+            keyIndex,
+        };
+
+        return WebRTCModule.frameCryptorSetKeyIndex(params)
+            .then(data => data['result']);
+    }
+
+    async getKeyIndex(): Promise<number> {
+        const params = {
+            frameCryptorId: this._frameCryptorId,
+        };
+
+        return WebRTCModule.frameCryptorGetKeyIndex(params)
+            .then(data => data['keyIndex']);
+    }
+
+    async setEnabled(enabled: boolean): Promise<boolean> {
+        const params = {
+            frameCryptorId: this._frameCryptorId,
+            enabled,
+        };
+
+        return WebRTCModule.frameCryptorSetEnabled(params)
+            .then(data => data['result']);
+    }
+
+    async getEnabled(): Promise<boolean> {
+        const params = {
+            frameCryptorId: this._frameCryptorId,
+        };
+
+        return WebRTCModule.frameCryptorGetEnabled(params)
+            .then(data => data['enabled']);
+    }
+
+    async dispose(): Promise<void> {
+        const params = {
+            frameCryptorId: this._frameCryptorId,
+        };
+
+        await WebRTCModule.frameCryptorDispose(params);
+        removeListener(this);
+    }
+
+    _registerEvents(): void {
+        addListener(this, 'frameCryptionStateChanged', (ev: any) => {
+            if (ev.participantId !== this._participantId || ev.frameCryptorId !== this._frameCryptorId) {
+                return;
+            }
+
+            log.debug(`${this.id} frameCryptionStateChanged ${ev.state}`);
+
+            const initDict = {
+                frameCryptor: this,
+                state: this._cryptorStateFromString(ev.state),
+            };
+
+            this.dispatchEvent(new RTCFrameCryptorStateEvent('framecryptorstatechanged', initDict));
+        });
+    }
+}
+
diff --git a/src/RTCFrameCryptorFactory.ts b/src/RTCFrameCryptorFactory.ts
new file mode 100644
index 000000000..c80669260
--- /dev/null
+++ b/src/RTCFrameCryptorFactory.ts
@@ -0,0 +1,102 @@
+import * as base64 from 'base64-js';
+import { NativeModules } from 'react-native';
+
+import RTCFrameCryptor from './RTCFrameCryptor';
+import RTCKeyProvider from
'./RTCKeyProvider'; +import RTCRtpReceiver from './RTCRtpReceiver'; +import RTCRtpSender from './RTCRtpSender'; +const { WebRTCModule } = NativeModules; + +export enum RTCFrameCryptorAlgorithm { + kAesGcm = 0, + // kAesCbc = 1, +} + +export type RTCKeyProviderOptions = { + sharedKey: boolean, + ratchetSalt: string | Uint8Array, + ratchetWindowSize: number, + uncryptedMagicBytes?: Uint8Array, + failureTolerance?: number, + keyRingSize?: number, + discardFrameWhenCryptorNotReady?: boolean +} + +export default class RTCFrameCryptorFactory { + static createFrameCryptorForRtpSender( + participantId: string, + sender: RTCRtpSender, + algorithm: RTCFrameCryptorAlgorithm, + keyProvider: RTCKeyProvider + ): RTCFrameCryptor { + const params = { + 'peerConnectionId': sender._peerConnectionId, + 'rtpSenderId': sender._id, + participantId, + 'keyProviderId': keyProvider._id, + 'type': 'sender', + 'algorithm': algorithm + }; + const result = WebRTCModule.frameCryptorFactoryCreateFrameCryptor(params); + + if (!result) { + throw new Error('Error when creating frame cryptor for sender'); + } + + return new RTCFrameCryptor(result, participantId); + } + static createFrameCryptorForRtpReceiver( + participantId: string, + receiver: RTCRtpReceiver, + algorithm: RTCFrameCryptorAlgorithm, + keyProvider: RTCKeyProvider + ): RTCFrameCryptor { + const params = { + 'peerConnectionId': receiver._peerConnectionId, + 'rtpReceiverId': receiver._id, + participantId, + 'keyProviderId': keyProvider._id, + 'type': 'receiver', + 'algorithm': algorithm + }; + const result = WebRTCModule.frameCryptorFactoryCreateFrameCryptor(params); + + if (!result) { + throw new Error('Error when creating frame cryptor for receiver'); + } + + return new RTCFrameCryptor(result, participantId); + } + + static createDefaultKeyProvider(options: RTCKeyProviderOptions): RTCKeyProvider { + const params = { + 'sharedKey': options.sharedKey, + 'ratchetWindowSize': options.ratchetWindowSize, + 'failureTolerance': 
options.failureTolerance ?? -1, + 'keyRingSize': options.keyRingSize ?? 16, + 'discardFrameWhenCryptorNotReady': options.discardFrameWhenCryptorNotReady ?? false + }; + + if (typeof options.ratchetSalt === 'string') { + params['ratchetSalt'] = options.ratchetSalt; + params['ratchetSaltIsBase64'] = false; + } else { + const bytes = options.ratchetSalt as Uint8Array; + + params['ratchetSalt'] = base64.fromByteArray(bytes); + params['ratchetSaltIsBase64'] = true; + } + + if (options.uncryptedMagicBytes) { + params['uncryptedMagicBytes'] = base64.fromByteArray(options.uncryptedMagicBytes); + } + + const result = WebRTCModule.frameCryptorFactoryCreateKeyProvider(params); + + if (!result) { + throw new Error('Error when creating key provider!'); + } + + return new RTCKeyProvider(result); + } +} \ No newline at end of file diff --git a/src/RTCIceCandidateEvent.ts b/src/RTCIceCandidateEvent.ts index af4d11a12..8118d0304 100644 --- a/src/RTCIceCandidateEvent.ts +++ b/src/RTCIceCandidateEvent.ts @@ -1,6 +1,6 @@ -import { Event } from 'event-target-shim/index'; - import type RTCIceCandidate from './RTCIceCandidate'; +import { Event } from './vendor/event-target-shim'; + type RTC_ICECANDIDATE_EVENTS = 'icecandidate' | 'icecandidateerror' diff --git a/src/RTCKeyProvider.ts b/src/RTCKeyProvider.ts new file mode 100644 index 000000000..b2395b080 --- /dev/null +++ b/src/RTCKeyProvider.ts @@ -0,0 +1,117 @@ +import * as base64 from 'base64-js'; +import { NativeModules } from 'react-native'; +const { WebRTCModule } = NativeModules; + +export enum FrameCryptorState { + FrameCryptorStateNew, + FrameCryptorStateOk, + FrameCryptorStateEncryptionFailed, + FrameCryptorStateDecryptionFailed, + FrameCryptorStateMissingKey, + FrameCryptorStateKeyRatcheted, + FrameCryptorStateInternalError, +} + +export default class RTCKeyProvider { + _id: string; + + constructor(keyProviderId: string) { + this._id = keyProviderId; + } + + async setSharedKey(key: string | Uint8Array, keyIndex = 0) { + const 
params = {
+            keyProviderId: this._id,
+            keyIndex,
+        };
+
+        if (typeof key === 'string') {
+            params['key'] = key;
+            params['keyIsBase64'] = false;
+        } else {
+            params['key'] = base64.fromByteArray(key as Uint8Array);
+            params['keyIsBase64'] = true;
+        }
+
+        return WebRTCModule.keyProviderSetSharedKey(params)
+            .then(data => data['result']);
+    }
+
+    async ratchetSharedKey(keyIndex = 0): Promise<Uint8Array> {
+        const params = {
+            keyProviderId: this._id,
+            keyIndex,
+        };
+
+        return WebRTCModule.keyProviderRatchetSharedKey(params)
+            .then(data => base64.toByteArray(data['result']));
+    }
+
+    async exportSharedKey(keyIndex = 0): Promise<Uint8Array> {
+        const params = {
+            keyProviderId: this._id,
+            keyIndex,
+        };
+
+        return WebRTCModule.keyProviderExportSharedKey(params)
+            .then(data => base64.toByteArray(data['result']));
+    }
+
+    async setKey(participantId: string, key: string | Uint8Array, keyIndex = 0): Promise<boolean> {
+        const params = {
+            keyProviderId: this._id,
+            participantId,
+            keyIndex,
+        };
+
+        if (typeof key === 'string') {
+            params['key'] = key;
+            params['keyIsBase64'] = false;
+        } else {
+            params['key'] = base64.fromByteArray(key as Uint8Array);
+            params['keyIsBase64'] = true;
+        }
+
+        return WebRTCModule.keyProviderSetKey(params)
+            .then(data => data['result']);
+    }
+
+    async ratchetKey(participantId: string, keyIndex = 0): Promise<Uint8Array> {
+        const params = {
+            keyProviderId: this._id,
+            participantId,
+            keyIndex,
+        };
+
+        return WebRTCModule.keyProviderRatchetKey(params)
+            .then(data => base64.toByteArray(data['result']));
+    }
+
+    async exportKey(participantId: string, keyIndex = 0): Promise<Uint8Array> {
+        const params = {
+            keyProviderId: this._id,
+            participantId,
+            keyIndex,
+        };
+
+        return WebRTCModule.keyProviderExportKey(params)
+            .then(data => base64.toByteArray(data['result']));
+    }
+
+    async setSifTrailer(trailer: Uint8Array) {
+        const params = {
+            keyProviderId: this._id,
+            'sifTrailer': base64.fromByteArray(trailer),
+        };
+
+        return WebRTCModule.keyProviderSetSifTrailer(params);
+    }
+
+    async
dispose() { + const params = { + keyProviderId: this._id, + }; + + return WebRTCModule.keyProviderDispose(params); + } +} diff --git a/src/RTCPeerConnection.ts b/src/RTCPeerConnection.ts index e2dc79111..30d3d7757 100644 --- a/src/RTCPeerConnection.ts +++ b/src/RTCPeerConnection.ts @@ -1,4 +1,3 @@ -import { EventTarget, Event, defineEventAttribute } from 'event-target-shim/index'; import { NativeModules } from 'react-native'; import { addListener, removeListener } from './EventEmitter'; @@ -6,6 +5,7 @@ import Logger from './Logger'; import MediaStream from './MediaStream'; import MediaStreamTrack from './MediaStreamTrack'; import MediaStreamTrackEvent from './MediaStreamTrackEvent'; +import RTCCertificate from './RTCCertificate'; import RTCDataChannel from './RTCDataChannel'; import RTCDataChannelEvent from './RTCDataChannelEvent'; import RTCIceCandidate from './RTCIceCandidate'; @@ -18,6 +18,8 @@ import RTCRtpTransceiver from './RTCRtpTransceiver'; import RTCSessionDescription, { RTCSessionDescriptionInit } from './RTCSessionDescription'; import RTCTrackEvent from './RTCTrackEvent'; import * as RTCUtil from './RTCUtil'; +import { RTCOfferOptions } from './RTCUtil'; +import { Event, EventTarget, getEventAttributeValue, setEventAttributeValue } from './vendor/event-target-shim'; const log = new Logger('pc'); const { WebRTCModule } = NativeModules; @@ -54,6 +56,7 @@ type RTCIceServer = { type RTCConfiguration = { bundlePolicy?: 'balanced' | 'max-compat' | 'max-bundle', + certificates?: RTCCertificate[], iceCandidatePoolSize?: number, iceServers?: RTCIceServer[], iceTransportPolicy?: 'all' | 'relay', @@ -88,6 +91,40 @@ export default class RTCPeerConnection extends EventTarget; _pendingTrackEvents: any[]; + static generateCertificate( + keygenAlgorithm: string | { + name: string, + namedCurve?: string, + modulusLength?: number, + publicExponent?: Uint8Array, + hash?: string + } + ): Promise { + const options: { keyType?: string, expires?: number } = {}; + + let 
algorithm = keygenAlgorithm; + + if (typeof algorithm === 'string') { + algorithm = { name: algorithm }; + } + + if (algorithm.name === 'RSASSA-PKCS1-v1_5') { + options.keyType = 'RSA'; + } else if (algorithm.name === 'ECDSA') { + options.keyType = 'ECDSA'; + } else { + // Default to ECDSA for other cases or if unspecified + // This behavior matches common expectations when modern defaults are preferred + if (algorithm.name && algorithm.name.toUpperCase().includes('RSA')) { + options.keyType = 'RSA'; + } else { + options.keyType = 'ECDSA'; + } + } + + return WebRTCModule.generateCertificate(options).then(info => new RTCCertificate(info)); + } + constructor(configuration?: RTCConfiguration) { super(); @@ -117,6 +154,16 @@ export default class RTCPeerConnection extends EventTarget s.urls); + + // Sanitize certificates. + if (configuration.certificates) { + // @ts-ignore + configuration.certificates = configuration.certificates.map(cert => { + return { + certificateId: cert._id + }; + }); + } } if (!WebRTCModule.peerConnectionInit(configuration, this._pcId)) { @@ -132,7 +179,87 @@ export default class RTCPeerConnection extends EventTarget { + if (this.connectionState === 'closed') { + throw new Error('Peer Connection is closed'); + } + log.debug(`${this._pcId} addIceCandidate`); if (!candidate || !candidate.candidate) { @@ -535,13 +666,11 @@ export default class RTCPeerConnection extends EventTarget !e.transceiver.stopped && e.transceiver.sender).filter(Boolean); + return this._transceivers.filter(e => !e.transceiver.stopped).map(e => e.transceiver.sender); } getReceivers(): RTCRtpReceiver[] { - // @ts-ignore - return this._transceivers.map(e => !e.transceiver.stopped && e.transceiver.receiver).filter(Boolean); + return this._transceivers.filter(e => !e.transceiver.stopped).map(e => e.transceiver.receiver); } close(): void { @@ -831,19 +960,3 @@ export default class RTCPeerConnection extends EventTarget a.order - b.order); } } - -/** - * Define the `onxxx` event 
handlers. - */ -const proto = RTCPeerConnection.prototype; - -defineEventAttribute(proto, 'connectionstatechange'); -defineEventAttribute(proto, 'icecandidate'); -defineEventAttribute(proto, 'icecandidateerror'); -defineEventAttribute(proto, 'iceconnectionstatechange'); -defineEventAttribute(proto, 'icegatheringstatechange'); -defineEventAttribute(proto, 'negotiationneeded'); -defineEventAttribute(proto, 'signalingstatechange'); -defineEventAttribute(proto, 'datachannel'); -defineEventAttribute(proto, 'track'); -defineEventAttribute(proto, 'error'); diff --git a/src/RTCRtpEncodingParameters.ts b/src/RTCRtpEncodingParameters.ts index 320b6b1b5..8c9186fcb 100644 --- a/src/RTCRtpEncodingParameters.ts +++ b/src/RTCRtpEncodingParameters.ts @@ -3,6 +3,7 @@ export interface RTCRtpEncodingParametersInit { rid?: string; maxFramerate?: number; maxBitrate?: number; + minBitrate?: number; scaleResolutionDownBy?: number; } @@ -11,12 +12,14 @@ export default class RTCRtpEncodingParameters { _rid: string | null; _maxFramerate: number | null; _maxBitrate: number | null; + _minBitrate: number | null; _scaleResolutionDownBy: number | null; constructor(init: RTCRtpEncodingParametersInit) { this.active = init.active; this._rid = init.rid ?? null; this._maxBitrate = init.maxBitrate ?? null; + this._minBitrate = init.minBitrate ?? null; this._maxFramerate = init.maxFramerate ?? null; this._scaleResolutionDownBy = init.scaleResolutionDownBy ?? 
null; } @@ -51,6 +54,19 @@ export default class RTCRtpEncodingParameters { } } + get minBitrate() { + return this._minBitrate; + } + + set minBitrate(bitrate) { + // eslint-disable-next-line eqeqeq + if (bitrate != null && bitrate >= 0) { + this._minBitrate = bitrate; + } else { + this._minBitrate = null; + } + } + get scaleResolutionDownBy() { return this._scaleResolutionDownBy; } @@ -77,6 +93,10 @@ export default class RTCRtpEncodingParameters { obj['maxBitrate'] = this._maxBitrate; } + if (this._minBitrate !== null) { + obj['minBitrate'] = this._minBitrate; + } + if (this._maxFramerate !== null) { obj['maxFramerate'] = this._maxFramerate; } diff --git a/src/RTCTrackEvent.ts b/src/RTCTrackEvent.ts index ea9bea40c..ec49b5949 100644 --- a/src/RTCTrackEvent.ts +++ b/src/RTCTrackEvent.ts @@ -1,9 +1,9 @@ -import { Event } from 'event-target-shim/index'; import MediaStream from './MediaStream'; import type MediaStreamTrack from './MediaStreamTrack'; import RTCRtpReceiver from './RTCRtpReceiver'; import RTCRtpTransceiver from './RTCRtpTransceiver'; +import { Event } from './vendor/event-target-shim'; type TRACK_EVENTS = 'track' diff --git a/src/RTCUtil.ts b/src/RTCUtil.ts index 417c57bfb..10e84990e 100644 --- a/src/RTCUtil.ts +++ b/src/RTCUtil.ts @@ -12,6 +12,13 @@ const FACING_MODES = [ 'user', 'environment' ]; const ASPECT_RATIO = 16 / 9; +export type RTCOfferOptions = { + iceRestart?:boolean; + offerToReceiveAudio?: boolean; + offerToReceiveVideo?: boolean; + voiceActivityDetection?:boolean +}; + const STANDARD_OFFER_OPTIONS = { icerestart: 'IceRestart', offertoreceiveaudio: 'OfferToReceiveAudio', @@ -157,10 +164,10 @@ export function isSdpTypeValid(type: string): boolean { * @param options - user supplied options * @return Normalized options */ -export function normalizeOfferOptions(options: object = {}): object { - const newOptions = {}; +export function normalizeOfferOptions(options?: RTCOfferOptions) { + const newOptions: Record = {}; - if (!options) { + if 
(!options || typeof options !== 'object') { return newOptions; } diff --git a/src/getDisplayMedia.ts b/src/getDisplayMedia.ts index c691b04bd..9ba45e0e7 100644 --- a/src/getDisplayMedia.ts +++ b/src/getDisplayMedia.ts @@ -6,9 +6,16 @@ import MediaStreamError from './MediaStreamError'; const { WebRTCModule } = NativeModules; -export default function getDisplayMedia(): Promise { +export interface Constraints { + android?: { + createConfigForDefaultDisplay?: boolean; + resolutionScale?: number; + } +} + +export default function getDisplayMedia(constraints: Constraints = {}): Promise { return new Promise((resolve, reject) => { - WebRTCModule.getDisplayMedia().then( + WebRTCModule.getDisplayMedia(constraints).then( data => { const { streamId, track } = data; diff --git a/src/index.ts b/src/index.ts index 7d0983b12..beee29c29 100644 --- a/src/index.ts +++ b/src/index.ts @@ -18,10 +18,18 @@ import MediaStreamTrack, { type MediaTrackSettings } from './MediaStreamTrack'; import MediaStreamTrackEvent from './MediaStreamTrackEvent'; import permissions from './Permissions'; import RTCAudioSession from './RTCAudioSession'; +import RTCCertificate from './RTCCertificate'; +import RTCDataPacketCryptor, { RTCEncryptedPacket } from './RTCDataPacketCryptor'; +import RTCDataPacketCryptorFactory from './RTCDataPacketCryptorFactory'; import RTCErrorEvent from './RTCErrorEvent'; +import RTCFrameCryptor, { RTCFrameCryptorState } from './RTCFrameCryptor'; +import RTCFrameCryptorFactory, { RTCFrameCryptorAlgorithm, RTCKeyProviderOptions } from './RTCFrameCryptorFactory'; import RTCIceCandidate from './RTCIceCandidate'; +import RTCKeyProvider from './RTCKeyProvider'; import RTCPeerConnection from './RTCPeerConnection'; +import RTCRtpEncodingParameters, { type RTCRtpEncodingParametersInit } from './RTCRtpEncodingParameters'; import RTCRtpReceiver from './RTCRtpReceiver'; +import RTCRtpSendParameters, { type RTCRtpSendParametersInit } from './RTCRtpSendParameters'; import RTCRtpSender from 
'./RTCRtpSender'; import RTCRtpTransceiver from './RTCRtpTransceiver'; import RTCSessionDescription from './RTCSessionDescription'; @@ -40,16 +48,30 @@ export { RTCIceCandidate, RTCPeerConnection, RTCSessionDescription, + RTCCertificate, RTCView, ScreenCapturePickerView, + RTCRtpEncodingParameters, RTCRtpTransceiver, RTCRtpReceiver, RTCRtpSender, + RTCRtpSendParameters, RTCErrorEvent, RTCAudioSession, + RTCDataPacketCryptor, + RTCDataPacketCryptorFactory, + RTCEncryptedPacket, + RTCFrameCryptor, + RTCFrameCryptorAlgorithm, + RTCFrameCryptorState, + RTCFrameCryptorFactory, + RTCKeyProvider, + RTCKeyProviderOptions, MediaStream, MediaStreamTrack, type MediaTrackSettings, + type RTCRtpEncodingParametersInit, + type RTCRtpSendParametersInit, mediaDevices, permissions, registerGlobals, @@ -75,6 +97,7 @@ function registerGlobals(): void { global.navigator.mediaDevices.enumerateDevices = mediaDevices.enumerateDevices.bind(mediaDevices); global.RTCIceCandidate = RTCIceCandidate; + global.RTCCertificate = RTCCertificate; global.RTCPeerConnection = RTCPeerConnection; global.RTCRtpReceiver = RTCRtpReceiver; global.RTCRtpSender = RTCRtpReceiver; diff --git a/src/vendor/event-target-shim/LICENSE b/src/vendor/event-target-shim/LICENSE new file mode 100644 index 000000000..c39e6949e --- /dev/null +++ b/src/vendor/event-target-shim/LICENSE @@ -0,0 +1,22 @@ +The MIT License (MIT) + +Copyright (c) 2015 Toru Nagashima + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. 
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+
diff --git a/src/vendor/event-target-shim/index.d.ts b/src/vendor/event-target-shim/index.d.ts
new file mode 100644
index 000000000..7a5bfc727
--- /dev/null
+++ b/src/vendor/event-target-shim/index.d.ts
@@ -0,0 +1,411 @@
+// Generated by dts-bundle-generator v5.5.0
+
+/**
+ * Set the error handler.
+ * @param value The error handler to set.
+ */
+export declare function setErrorHandler(value: setErrorHandler.ErrorHandler | undefined): void;
+export declare namespace setErrorHandler {
+    /**
+     * The error handler.
+     * @param error The thrown error object.
+     */
+    type ErrorHandler = (error: Error) => void;
+}
+/**
+ * An implementation of the `EventTarget` interface.
+ * @see https://dom.spec.whatwg.org/#eventtarget
+ */
+export declare class EventTarget<TEventMap extends Record<string, Event> = Record<string, Event>, TMode extends "standard" | "strict" = "standard"> {
+    /**
+     * Initialize this instance.
+     */
+    constructor();
+    /**
+     * Add an event listener.
+     * @param type The event type.
+     * @param callback The event listener.
+     * @param options Options.
+     */
+    addEventListener<T extends string & keyof TEventMap>(type: T, callback?: EventTarget.EventListener<this, TEventMap[T]> | null, options?: EventTarget.AddOptions): void;
+    /**
+     * Add an event listener.
+     * @param type The event type.
+     * @param callback The event listener.
+     * @param options Options.
+     */
+    addEventListener(type: string, callback?: EventTarget.FallbackEventListener<this, TMode>, options?: EventTarget.AddOptions): void;
+    /**
+     * Add an event listener.
+     * @param type The event type.
+     * @param callback The event listener.
+ * @param capture The capture flag. + * @deprecated Use `{capture: boolean}` object instead of a boolean value. + */ + addEventListener(type: T, callback: EventTarget.EventListener | null | undefined, capture: boolean): void; + /** + * Add an event listener. + * @param type The event type. + * @param callback The event listener. + * @param capture The capture flag. + * @deprecated Use `{capture: boolean}` object instead of a boolean value. + */ + addEventListener(type: string, callback: EventTarget.FallbackEventListener, capture: boolean): void; + /** + * Remove an added event listener. + * @param type The event type. + * @param callback The event listener. + * @param options Options. + */ + removeEventListener(type: T, callback?: EventTarget.EventListener | null, options?: EventTarget.Options): void; + /** + * Remove an added event listener. + * @param type The event type. + * @param callback The event listener. + * @param options Options. + */ + removeEventListener(type: string, callback?: EventTarget.FallbackEventListener, options?: EventTarget.Options): void; + /** + * Remove an added event listener. + * @param type The event type. + * @param callback The event listener. + * @param capture The capture flag. + * @deprecated Use `{capture: boolean}` object instead of a boolean value. + */ + removeEventListener(type: T, callback: EventTarget.EventListener | null | undefined, capture: boolean): void; + /** + * Remove an added event listener. + * @param type The event type. + * @param callback The event listener. + * @param capture The capture flag. + * @deprecated Use `{capture: boolean}` object instead of a boolean value. + */ + removeEventListener(type: string, callback: EventTarget.FallbackEventListener, capture: boolean): void; + /** + * Dispatch an event. + * @param event The `Event` object to dispatch. + */ + dispatchEvent(event: EventTarget.EventData): boolean; + /** + * Dispatch an event. + * @param event The `Event` object to dispatch. 
+ */ + dispatchEvent(event: EventTarget.FallbackEvent): boolean; +} +export declare namespace EventTarget { + /** + * The event listener. + */ + type EventListener, TEvent extends Event> = CallbackFunction | CallbackObject; + /** + * The event listener function. + */ + interface CallbackFunction, TEvent extends Event> { + (this: TEventTarget, event: TEvent): void; + } + /** + * The event listener object. + * @see https://dom.spec.whatwg.org/#callbackdef-eventlistener + */ + interface CallbackObject { + handleEvent(event: TEvent): void; + } + /** + * The common options for both `addEventListener` and `removeEventListener` methods. + * @see https://dom.spec.whatwg.org/#dictdef-eventlisteneroptions + */ + interface Options { + capture?: boolean; + } + /** + * The options for the `addEventListener` methods. + * @see https://dom.spec.whatwg.org/#dictdef-addeventlisteneroptions + */ + interface AddOptions extends Options { + passive?: boolean; + once?: boolean; + signal?: AbortSignal | null | undefined; + } + /** + * The abort signal. + * @see https://dom.spec.whatwg.org/#abortsignal + */ + interface AbortSignal extends EventTarget<{ + abort: Event; + }> { + readonly aborted: boolean; + onabort: CallbackFunction | null; + } + /** + * The event data to dispatch in strict mode. + */ + type EventData, TMode extends "standard" | "strict", TEventType extends string> = TMode extends "strict" ? IsValidEventMap extends true ? ExplicitType & Omit & Partial> : never : never; + /** + * Define explicit `type` property if `T` is a string literal. + * Otherwise, never. + */ + type ExplicitType = string extends T ? never : { + readonly type: T; + }; + /** + * The event listener type in standard mode. + * Otherwise, never. + */ + type FallbackEventListener, TMode extends "standard" | "strict"> = TMode extends "standard" ? EventListener | null | undefined : never; + /** + * The event type in standard mode. + * Otherwise, never. + */ + type FallbackEvent = TMode extends "standard" ? 
Event : never; + /** + * Check if given event map is valid. + * It's valid if the keys of the event map are narrower than `string`. + */ + type IsValidEventMap = string extends keyof T ? false : true; +} +/** + * An implementation of `Event` interface, that wraps a given event object. + * `EventTarget` shim can control the internal state of this `Event` objects. + * @see https://dom.spec.whatwg.org/#event + */ +export declare class Event { + /** + * @see https://dom.spec.whatwg.org/#dom-event-none + */ + static get NONE(): number; + /** + * @see https://dom.spec.whatwg.org/#dom-event-capturing_phase + */ + static get CAPTURING_PHASE(): number; + /** + * @see https://dom.spec.whatwg.org/#dom-event-at_target + */ + static get AT_TARGET(): number; + /** + * @see https://dom.spec.whatwg.org/#dom-event-bubbling_phase + */ + static get BUBBLING_PHASE(): number; + /** + * Initialize this event instance. + * @param type The type of this event. + * @param eventInitDict Options to initialize. + * @see https://dom.spec.whatwg.org/#dom-event-event + */ + constructor(type: TEventType, eventInitDict?: Event.EventInit); + /** + * The type of this event. + * @see https://dom.spec.whatwg.org/#dom-event-type + */ + get type(): TEventType; + /** + * The event target of the current dispatching. + * @see https://dom.spec.whatwg.org/#dom-event-target + */ + get target(): EventTarget | null; + /** + * The event target of the current dispatching. + * @deprecated Use the `target` property instead. + * @see https://dom.spec.whatwg.org/#dom-event-srcelement + */ + get srcElement(): EventTarget | null; + /** + * The event target of the current dispatching. + * @see https://dom.spec.whatwg.org/#dom-event-currenttarget + */ + get currentTarget(): EventTarget | null; + /** + * The event target of the current dispatching. + * This doesn't support node tree. 
+ * @see https://dom.spec.whatwg.org/#dom-event-composedpath + */ + composedPath(): EventTarget[]; + /** + * @see https://dom.spec.whatwg.org/#dom-event-none + */ + get NONE(): number; + /** + * @see https://dom.spec.whatwg.org/#dom-event-capturing_phase + */ + get CAPTURING_PHASE(): number; + /** + * @see https://dom.spec.whatwg.org/#dom-event-at_target + */ + get AT_TARGET(): number; + /** + * @see https://dom.spec.whatwg.org/#dom-event-bubbling_phase + */ + get BUBBLING_PHASE(): number; + /** + * The current event phase. + * @see https://dom.spec.whatwg.org/#dom-event-eventphase + */ + get eventPhase(): number; + /** + * Stop event bubbling. + * Because this shim doesn't support node tree, this merely changes the `cancelBubble` property value. + * @see https://dom.spec.whatwg.org/#dom-event-stoppropagation + */ + stopPropagation(): void; + /** + * `true` if event bubbling was stopped. + * @deprecated + * @see https://dom.spec.whatwg.org/#dom-event-cancelbubble + */ + get cancelBubble(): boolean; + /** + * Stop event bubbling if `true` is set. + * @deprecated Use the `stopPropagation()` method instead. + * @see https://dom.spec.whatwg.org/#dom-event-cancelbubble + */ + set cancelBubble(value: boolean); + /** + * Stop event bubbling and subsequent event listener callings. + * @see https://dom.spec.whatwg.org/#dom-event-stopimmediatepropagation + */ + stopImmediatePropagation(): void; + /** + * `true` if this event will bubble. + * @see https://dom.spec.whatwg.org/#dom-event-bubbles + */ + get bubbles(): boolean; + /** + * `true` if this event can be canceled by the `preventDefault()` method. + * @see https://dom.spec.whatwg.org/#dom-event-cancelable + */ + get cancelable(): boolean; + /** + * `true` if the default behavior will act. + * @deprecated Use the `defaultPrevented` proeprty instead. + * @see https://dom.spec.whatwg.org/#dom-event-returnvalue + */ + get returnValue(): boolean; + /** + * Cancel the default behavior if `false` is set. 
+ * @deprecated Use the `preventDefault()` method instead. + * @see https://dom.spec.whatwg.org/#dom-event-returnvalue + */ + set returnValue(value: boolean); + /** + * Cancel the default behavior. + * @see https://dom.spec.whatwg.org/#dom-event-preventdefault + */ + preventDefault(): void; + /** + * `true` if the default behavior was canceled. + * @see https://dom.spec.whatwg.org/#dom-event-defaultprevented + */ + get defaultPrevented(): boolean; + /** + * @see https://dom.spec.whatwg.org/#dom-event-composed + */ + get composed(): boolean; + /** + * @see https://dom.spec.whatwg.org/#dom-event-istrusted + */ + get isTrusted(): boolean; + /** + * @see https://dom.spec.whatwg.org/#dom-event-timestamp + */ + get timeStamp(): number; + /** + * @deprecated Don't use this method. The constructor did initialization. + */ + initEvent(type: string, bubbles?: boolean, cancelable?: boolean): void; +} +export declare namespace Event { + /** + * The options of the `Event` constructor. + * @see https://dom.spec.whatwg.org/#dictdef-eventinit + */ + interface EventInit { + bubbles?: boolean; + cancelable?: boolean; + composed?: boolean; + } +} +/** + * Get the current value of a given event attribute. + * @param target The `EventTarget` object to get. + * @param type The event type. + */ +export declare function getEventAttributeValue, TEvent extends Event>(target: TEventTarget, type: string): EventTarget.CallbackFunction | null; +/** + * Set an event listener to a given event attribute. + * @param target The `EventTarget` object to set. + * @param type The event type. + * @param callback The event listener. + */ +export declare function setEventAttributeValue(target: EventTarget, type: string, callback: EventTarget.CallbackFunction | null): void; +/** + * Define an `EventTarget` class that has event attibutes. + * @param types The types to define event attributes. 
+ * @deprecated Use `getEventAttributeValue`/`setEventAttributeValue` pair on your derived class instead because of static analysis friendly. + */ +export declare function defineCustomEventTarget, TMode extends "standard" | "strict" = "standard">(...types: (string & keyof TEventMap)[]): defineCustomEventTarget.CustomEventTargetConstructor; +export declare namespace defineCustomEventTarget { + /** + * The interface of CustomEventTarget constructor. + */ + type CustomEventTargetConstructor, TMode extends "standard" | "strict"> = { + /** + * Create a new instance. + */ + new (): CustomEventTarget; + /** + * prototype object. + */ + prototype: CustomEventTarget; + }; + /** + * The interface of CustomEventTarget. + */ + type CustomEventTarget, TMode extends "standard" | "strict"> = EventTarget & defineEventAttribute.EventAttributes; +} +/** + * Define an event attribute. + * @param target The `EventTarget` object to define an event attribute. + * @param type The event type to define. + * @param _eventClass Unused, but to infer `Event` class type. + * @deprecated Use `getEventAttributeValue`/`setEventAttributeValue` pair on your derived class instead because of static analysis friendly. + */ +export declare function defineEventAttribute(target: TEventTarget, type: TEventType, _eventClass?: TEventConstrucor): asserts target is TEventTarget & defineEventAttribute.EventAttributes>>; +export declare namespace defineEventAttribute { + /** + * Definition of event attributes. + */ + type EventAttributes, TEventMap extends Record> = { + [P in string & keyof TEventMap as `on${P}`]: EventTarget.CallbackFunction | null; + }; +} +/** + * Set the warning handler. + * @param value The warning handler to set. + */ +export declare function setWarningHandler(value: setWarningHandler.WarningHandler | undefined): void; +export declare namespace setWarningHandler { + /** + * The warning information. + */ + interface Warning { + /** + * The code of this warning. 
+ */ + code: string; + /** + * The message in English. + */ + message: string; + /** + * The arguments for replacing placeholders in the text. + */ + args: any[]; + } + /** + * The warning handler. + * @param warning The warning. + */ + type WarningHandler = (warning: Warning) => void; +} +export default EventTarget; + +export {}; diff --git a/src/vendor/event-target-shim/index.js b/src/vendor/event-target-shim/index.js new file mode 100644 index 000000000..2bf2cb8b8 --- /dev/null +++ b/src/vendor/event-target-shim/index.js @@ -0,0 +1,1186 @@ +'use strict'; + +Object.defineProperty(exports, '__esModule', { value: true }); + +/** + * Assert a condition. + * @param condition The condition that it should satisfy. + * @param message The error message. + * @param args The arguments for replacing placeholders in the message. + */ +function assertType(condition, message, ...args) { + if (!condition) { + throw new TypeError(format(message, args)); + } +} +/** + * Convert a text and arguments to one string. + * @param message The formating text + * @param args The arguments. + */ +function format(message, args) { + let i = 0; + return message.replace(/%[os]/gu, () => anyToString(args[i++])); +} +/** + * Convert a value to a string representation. + * @param x The value to get the string representation. + */ +function anyToString(x) { + if (typeof x !== "object" || x === null) { + return String(x); + } + return Object.prototype.toString.call(x); +} + +let currentErrorHandler; +/** + * Set the error handler. + * @param value The error handler to set. + */ +function setErrorHandler(value) { + assertType(typeof value === "function" || value === undefined, "The error handler must be a function or undefined, but got %o.", value); + currentErrorHandler = value; +} +/** + * Print a error message. + * @param maybeError The error object. + */ +function reportError(maybeError) { + try { + const error = maybeError instanceof Error + ? 
maybeError + : new Error(anyToString(maybeError)); + // Call the user-defined error handler if exists. + if (currentErrorHandler) { + currentErrorHandler(error); + return; + } + // Dispatch an `error` event if this is on a browser. + if (typeof dispatchEvent === "function" && + typeof ErrorEvent === "function") { + dispatchEvent(new ErrorEvent("error", { error, message: error.message })); + } + // Emit an `uncaughtException` event if this is on Node.js. + //istanbul ignore else + else if (typeof process !== "undefined" && + typeof process.emit === "function") { + process.emit("uncaughtException", error); + return; + } + // Otherwise, print the error. + console.error(error); + } + catch (_a) { + // ignore. + } +} + +/** + * The global object. + */ +//istanbul ignore next +const Global = typeof window !== "undefined" + ? window + : typeof self !== "undefined" + ? self + : typeof global !== "undefined" + ? global + : typeof globalThis !== "undefined" + ? globalThis + : undefined; + +let currentWarnHandler; +/** + * Set the warning handler. + * @param value The warning handler to set. + */ +function setWarningHandler(value) { + assertType(typeof value === "function" || value === undefined, "The warning handler must be a function or undefined, but got %o.", value); + currentWarnHandler = value; +} +/** + * The warning information. + */ +class Warning { + constructor(code, message) { + this.code = code; + this.message = message; + } + /** + * Report this warning. + * @param args The arguments of the warning. + */ + warn(...args) { + var _a; + try { + // Call the user-defined warning handler if exists. + if (currentWarnHandler) { + currentWarnHandler({ ...this, args }); + return; + } + // Otherwise, print the warning. + const stack = ((_a = new Error().stack) !== null && _a !== void 0 ? _a : "").replace(/^(?:.+?\n){2}/gu, "\n"); + console.warn(this.message, ...args, stack); + } + catch (_b) { + // Ignore. 
+ } + } +} + +const InitEventWasCalledWhileDispatching = new Warning("W01", "Unable to initialize event under dispatching."); +const FalsyWasAssignedToCancelBubble = new Warning("W02", "Assigning any falsy value to 'cancelBubble' property has no effect."); +const TruthyWasAssignedToReturnValue = new Warning("W03", "Assigning any truthy value to 'returnValue' property has no effect."); +const NonCancelableEventWasCanceled = new Warning("W04", "Unable to preventDefault on non-cancelable events."); +const CanceledInPassiveListener = new Warning("W05", "Unable to preventDefault inside passive event listener invocation."); +const EventListenerWasDuplicated = new Warning("W06", "An event listener wasn't added because it has been added already: %o, %o"); +const OptionWasIgnored = new Warning("W07", "The %o option value was abandoned because the event listener wasn't added as duplicated."); +const InvalidEventListener = new Warning("W08", "The 'callback' argument must be a function or an object that has 'handleEvent' method: %o"); +const InvalidAttributeHandler = new Warning("W09", "Event attribute handler must be a function: %o"); + +/*eslint-disable class-methods-use-this */ +/** + * An implementation of `Event` interface, that wraps a given event object. + * `EventTarget` shim can control the internal state of this `Event` objects. + * @see https://dom.spec.whatwg.org/#event + */ +class Event { + /** + * @see https://dom.spec.whatwg.org/#dom-event-none + */ + static get NONE() { + return NONE; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-capturing_phase + */ + static get CAPTURING_PHASE() { + return CAPTURING_PHASE; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-at_target + */ + static get AT_TARGET() { + return AT_TARGET; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-bubbling_phase + */ + static get BUBBLING_PHASE() { + return BUBBLING_PHASE; + } + /** + * Initialize this event instance. + * @param type The type of this event. 
+ * @param eventInitDict Options to initialize. + * @see https://dom.spec.whatwg.org/#dom-event-event + */ + constructor(type, eventInitDict) { + Object.defineProperty(this, "isTrusted", { + value: false, + enumerable: true, + }); + const opts = eventInitDict !== null && eventInitDict !== void 0 ? eventInitDict : {}; + internalDataMap.set(this, { + type: String(type), + bubbles: Boolean(opts.bubbles), + cancelable: Boolean(opts.cancelable), + composed: Boolean(opts.composed), + target: null, + currentTarget: null, + stopPropagationFlag: false, + stopImmediatePropagationFlag: false, + canceledFlag: false, + inPassiveListenerFlag: false, + dispatchFlag: false, + timeStamp: Date.now(), + }); + } + /** + * The type of this event. + * @see https://dom.spec.whatwg.org/#dom-event-type + */ + get type() { + return $(this).type; + } + /** + * The event target of the current dispatching. + * @see https://dom.spec.whatwg.org/#dom-event-target + */ + get target() { + return $(this).target; + } + /** + * The event target of the current dispatching. + * @deprecated Use the `target` property instead. + * @see https://dom.spec.whatwg.org/#dom-event-srcelement + */ + get srcElement() { + return $(this).target; + } + /** + * The event target of the current dispatching. + * @see https://dom.spec.whatwg.org/#dom-event-currenttarget + */ + get currentTarget() { + return $(this).currentTarget; + } + /** + * The event target of the current dispatching. + * This doesn't support node tree. 
+ * @see https://dom.spec.whatwg.org/#dom-event-composedpath + */ + composedPath() { + const currentTarget = $(this).currentTarget; + if (currentTarget) { + return [currentTarget]; + } + return []; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-none + */ + get NONE() { + return NONE; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-capturing_phase + */ + get CAPTURING_PHASE() { + return CAPTURING_PHASE; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-at_target + */ + get AT_TARGET() { + return AT_TARGET; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-bubbling_phase + */ + get BUBBLING_PHASE() { + return BUBBLING_PHASE; + } + /** + * The current event phase. + * @see https://dom.spec.whatwg.org/#dom-event-eventphase + */ + get eventPhase() { + return $(this).dispatchFlag ? 2 : 0; + } + /** + * Stop event bubbling. + * Because this shim doesn't support node tree, this merely changes the `cancelBubble` property value. + * @see https://dom.spec.whatwg.org/#dom-event-stoppropagation + */ + stopPropagation() { + $(this).stopPropagationFlag = true; + } + /** + * `true` if event bubbling was stopped. + * @deprecated + * @see https://dom.spec.whatwg.org/#dom-event-cancelbubble + */ + get cancelBubble() { + return $(this).stopPropagationFlag; + } + /** + * Stop event bubbling if `true` is set. + * @deprecated Use the `stopPropagation()` method instead. + * @see https://dom.spec.whatwg.org/#dom-event-cancelbubble + */ + set cancelBubble(value) { + if (value) { + $(this).stopPropagationFlag = true; + } + else { + FalsyWasAssignedToCancelBubble.warn(); + } + } + /** + * Stop event bubbling and subsequent event listener callings. + * @see https://dom.spec.whatwg.org/#dom-event-stopimmediatepropagation + */ + stopImmediatePropagation() { + const data = $(this); + data.stopPropagationFlag = data.stopImmediatePropagationFlag = true; + } + /** + * `true` if this event will bubble. 
+ * @see https://dom.spec.whatwg.org/#dom-event-bubbles + */ + get bubbles() { + return $(this).bubbles; + } + /** + * `true` if this event can be canceled by the `preventDefault()` method. + * @see https://dom.spec.whatwg.org/#dom-event-cancelable + */ + get cancelable() { + return $(this).cancelable; + } + /** + * `true` if the default behavior will act. + * @deprecated Use the `defaultPrevented` proeprty instead. + * @see https://dom.spec.whatwg.org/#dom-event-returnvalue + */ + get returnValue() { + return !$(this).canceledFlag; + } + /** + * Cancel the default behavior if `false` is set. + * @deprecated Use the `preventDefault()` method instead. + * @see https://dom.spec.whatwg.org/#dom-event-returnvalue + */ + set returnValue(value) { + if (!value) { + setCancelFlag($(this)); + } + else { + TruthyWasAssignedToReturnValue.warn(); + } + } + /** + * Cancel the default behavior. + * @see https://dom.spec.whatwg.org/#dom-event-preventdefault + */ + preventDefault() { + setCancelFlag($(this)); + } + /** + * `true` if the default behavior was canceled. + * @see https://dom.spec.whatwg.org/#dom-event-defaultprevented + */ + get defaultPrevented() { + return $(this).canceledFlag; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-composed + */ + get composed() { + return $(this).composed; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-istrusted + */ + //istanbul ignore next + get isTrusted() { + return false; + } + /** + * @see https://dom.spec.whatwg.org/#dom-event-timestamp + */ + get timeStamp() { + return $(this).timeStamp; + } + /** + * @deprecated Don't use this method. The constructor did initialization. 
+ */ + initEvent(type, bubbles = false, cancelable = false) { + const data = $(this); + if (data.dispatchFlag) { + InitEventWasCalledWhileDispatching.warn(); + return; + } + internalDataMap.set(this, { + ...data, + type: String(type), + bubbles: Boolean(bubbles), + cancelable: Boolean(cancelable), + target: null, + currentTarget: null, + stopPropagationFlag: false, + stopImmediatePropagationFlag: false, + canceledFlag: false, + }); + } +} +//------------------------------------------------------------------------------ +// Helpers +//------------------------------------------------------------------------------ +const NONE = 0; +const CAPTURING_PHASE = 1; +const AT_TARGET = 2; +const BUBBLING_PHASE = 3; +/** + * Private data for event wrappers. + */ +const internalDataMap = new WeakMap(); +/** + * Get private data. + * @param event The event object to get private data. + * @param name The variable name to report. + * @returns The private data of the event. + */ +function $(event, name = "this") { + const retv = internalDataMap.get(event); + assertType(retv != null, "'%s' must be an object that Event constructor created, but got another one: %o", name, event); + return retv; +} +/** + * https://dom.spec.whatwg.org/#set-the-canceled-flag + * @param data private data. 
+ */ +function setCancelFlag(data) { + if (data.inPassiveListenerFlag) { + CanceledInPassiveListener.warn(); + return; + } + if (!data.cancelable) { + NonCancelableEventWasCanceled.warn(); + return; + } + data.canceledFlag = true; +} +// Set enumerable +Object.defineProperty(Event, "NONE", { enumerable: true }); +Object.defineProperty(Event, "CAPTURING_PHASE", { enumerable: true }); +Object.defineProperty(Event, "AT_TARGET", { enumerable: true }); +Object.defineProperty(Event, "BUBBLING_PHASE", { enumerable: true }); +const keys = Object.getOwnPropertyNames(Event.prototype); +for (let i = 0; i < keys.length; ++i) { + if (keys[i] === "constructor") { + continue; + } + Object.defineProperty(Event.prototype, keys[i], { enumerable: true }); +} +// Ensure `event instanceof window.Event` is `true`. +if (typeof Global !== "undefined" && typeof Global.Event !== "undefined") { + Object.setPrototypeOf(Event.prototype, Global.Event.prototype); +} + +/** + * Create a new InvalidStateError instance. + * @param message The error message. 
+ */ +function createInvalidStateError(message) { + if (Global.DOMException) { + return new Global.DOMException(message, "InvalidStateError"); + } + if (DOMException == null) { + DOMException = class DOMException extends Error { + constructor(msg) { + super(msg); + if (Error.captureStackTrace) { + Error.captureStackTrace(this, DOMException); + } + } + // eslint-disable-next-line class-methods-use-this + get code() { + return 11; + } + // eslint-disable-next-line class-methods-use-this + get name() { + return "InvalidStateError"; + } + }; + Object.defineProperties(DOMException.prototype, { + code: { enumerable: true }, + name: { enumerable: true }, + }); + defineErrorCodeProperties(DOMException); + defineErrorCodeProperties(DOMException.prototype); + } + return new DOMException(message); +} +//------------------------------------------------------------------------------ +// Helpers +//------------------------------------------------------------------------------ +let DOMException; +const ErrorCodeMap = { + INDEX_SIZE_ERR: 1, + DOMSTRING_SIZE_ERR: 2, + HIERARCHY_REQUEST_ERR: 3, + WRONG_DOCUMENT_ERR: 4, + INVALID_CHARACTER_ERR: 5, + NO_DATA_ALLOWED_ERR: 6, + NO_MODIFICATION_ALLOWED_ERR: 7, + NOT_FOUND_ERR: 8, + NOT_SUPPORTED_ERR: 9, + INUSE_ATTRIBUTE_ERR: 10, + INVALID_STATE_ERR: 11, + SYNTAX_ERR: 12, + INVALID_MODIFICATION_ERR: 13, + NAMESPACE_ERR: 14, + INVALID_ACCESS_ERR: 15, + VALIDATION_ERR: 16, + TYPE_MISMATCH_ERR: 17, + SECURITY_ERR: 18, + NETWORK_ERR: 19, + ABORT_ERR: 20, + URL_MISMATCH_ERR: 21, + QUOTA_EXCEEDED_ERR: 22, + TIMEOUT_ERR: 23, + INVALID_NODE_TYPE_ERR: 24, + DATA_CLONE_ERR: 25, +}; +function defineErrorCodeProperties(obj) { + const keys = Object.keys(ErrorCodeMap); + for (let i = 0; i < keys.length; ++i) { + const key = keys[i]; + const value = ErrorCodeMap[key]; + Object.defineProperty(obj, key, { + get() { + return value; + }, + configurable: true, + enumerable: true, + }); + } +} + +/** + * An implementation of `Event` interface, that wraps a 
given event object. + * This class controls the internal state of `Event`. + * @see https://dom.spec.whatwg.org/#interface-event + */ +class EventWrapper extends Event { + /** + * Wrap a given event object to control states. + * @param event The event-like object to wrap. + */ + static wrap(event) { + return new (getWrapperClassOf(event))(event); + } + constructor(event) { + super(event.type, { + bubbles: event.bubbles, + cancelable: event.cancelable, + composed: event.composed, + }); + if (event.cancelBubble) { + super.stopPropagation(); + } + if (event.defaultPrevented) { + super.preventDefault(); + } + internalDataMap$1.set(this, { original: event }); + // Define accessors + const keys = Object.keys(event); + for (let i = 0; i < keys.length; ++i) { + const key = keys[i]; + if (!(key in this)) { + Object.defineProperty(this, key, defineRedirectDescriptor(event, key)); + } + } + } + stopPropagation() { + super.stopPropagation(); + const { original } = $$1(this); + if ("stopPropagation" in original) { + original.stopPropagation(); + } + } + get cancelBubble() { + return super.cancelBubble; + } + set cancelBubble(value) { + super.cancelBubble = value; + const { original } = $$1(this); + if ("cancelBubble" in original) { + original.cancelBubble = value; + } + } + stopImmediatePropagation() { + super.stopImmediatePropagation(); + const { original } = $$1(this); + if ("stopImmediatePropagation" in original) { + original.stopImmediatePropagation(); + } + } + get returnValue() { + return super.returnValue; + } + set returnValue(value) { + super.returnValue = value; + const { original } = $$1(this); + if ("returnValue" in original) { + original.returnValue = value; + } + } + preventDefault() { + super.preventDefault(); + const { original } = $$1(this); + if ("preventDefault" in original) { + original.preventDefault(); + } + } + get timeStamp() { + const { original } = $$1(this); + if ("timeStamp" in original) { + return original.timeStamp; + } + return super.timeStamp; + 
} +} +/** + * Private data for event wrappers. + */ +const internalDataMap$1 = new WeakMap(); +/** + * Get private data. + * @param event The event object to get private data. + * @returns The private data of the event. + */ +function $$1(event) { + const retv = internalDataMap$1.get(event); + assertType(retv != null, "'this' is expected an Event object, but got", event); + return retv; +} +/** + * Cache for wrapper classes. + * @type {WeakMap} + * @private + */ +const wrapperClassCache = new WeakMap(); +// Make association for wrappers. +wrapperClassCache.set(Object.prototype, EventWrapper); +if (typeof Global !== "undefined" && typeof Global.Event !== "undefined") { + wrapperClassCache.set(Global.Event.prototype, EventWrapper); +} +/** + * Get the wrapper class of a given prototype. + * @param originalEvent The event object to wrap. + */ +function getWrapperClassOf(originalEvent) { + const prototype = Object.getPrototypeOf(originalEvent); + if (prototype == null) { + return EventWrapper; + } + let wrapper = wrapperClassCache.get(prototype); + if (wrapper == null) { + wrapper = defineWrapper(getWrapperClassOf(prototype), prototype); + wrapperClassCache.set(prototype, wrapper); + } + return wrapper; +} +/** + * Define new wrapper class. + * @param BaseEventWrapper The base wrapper class. + * @param originalPrototype The prototype of the original event. + */ +function defineWrapper(BaseEventWrapper, originalPrototype) { + class CustomEventWrapper extends BaseEventWrapper { + } + const keys = Object.keys(originalPrototype); + for (let i = 0; i < keys.length; ++i) { + Object.defineProperty(CustomEventWrapper.prototype, keys[i], defineRedirectDescriptor(originalPrototype, keys[i])); + } + return CustomEventWrapper; +} +/** + * Get the property descriptor to redirect a given property. 
+ */ +function defineRedirectDescriptor(obj, key) { + const d = Object.getOwnPropertyDescriptor(obj, key); + return { + get() { + const original = $$1(this).original; + const value = original[key]; + if (typeof value === "function") { + return value.bind(original); + } + return value; + }, + set(value) { + const original = $$1(this).original; + original[key] = value; + }, + configurable: d.configurable, + enumerable: d.enumerable, + }; +} + +/** + * Create a new listener. + * @param callback The callback function. + * @param capture The capture flag. + * @param passive The passive flag. + * @param once The once flag. + * @param signal The abort signal. + * @param signalListener The abort event listener for the abort signal. + */ +function createListener(callback, capture, passive, once, signal, signalListener) { + return { + callback, + flags: (capture ? 1 /* Capture */ : 0) | + (passive ? 2 /* Passive */ : 0) | + (once ? 4 /* Once */ : 0), + signal, + signalListener, + }; +} +/** + * Set the `removed` flag to the given listener. + * @param listener The listener to check. + */ +function setRemoved(listener) { + listener.flags |= 8 /* Removed */; +} +/** + * Check if the given listener has the `capture` flag or not. + * @param listener The listener to check. + */ +function isCapture(listener) { + return (listener.flags & 1 /* Capture */) === 1 /* Capture */; +} +/** + * Check if the given listener has the `passive` flag or not. + * @param listener The listener to check. + */ +function isPassive(listener) { + return (listener.flags & 2 /* Passive */) === 2 /* Passive */; +} +/** + * Check if the given listener has the `once` flag or not. + * @param listener The listener to check. + */ +function isOnce(listener) { + return (listener.flags & 4 /* Once */) === 4 /* Once */; +} +/** + * Check if the given listener has the `removed` flag or not. + * @param listener The listener to check. 
+ */ +function isRemoved(listener) { + return (listener.flags & 8 /* Removed */) === 8 /* Removed */; +} +/** + * Call an event listener. + * @param listener The listener to call. + * @param target The event target object for `thisArg`. + * @param event The event object for the first argument. + */ +function invokeCallback({ callback }, target, event) { + try { + if (typeof callback === "function") { + callback.call(target, event); + } + else if (typeof callback.handleEvent === "function") { + callback.handleEvent(event); + } + } + catch (thrownError) { + reportError(thrownError); + } +} + +/** + * Find the index of the given listener. + * This returns `-1` if not found. + * @param list The listener list. + * @param callback The callback function to find. + * @param capture The capture flag to find. + */ +function findIndexOfListener({ listeners }, callback, capture) { + for (let i = 0; i < listeners.length; ++i) { + if (listeners[i].callback === callback && + isCapture(listeners[i]) === capture) { + return i; + } + } + return -1; +} +/** + * Add the given listener. + * Does copy-on-write if needed. + * @param list The listener list. + * @param callback The callback function. + * @param capture The capture flag. + * @param passive The passive flag. + * @param once The once flag. + * @param signal The abort signal. + */ +function addListener(list, callback, capture, passive, once, signal) { + let signalListener; + if (signal) { + signalListener = removeListener.bind(null, list, callback, capture); + signal.addEventListener("abort", signalListener); + } + const listener = createListener(callback, capture, passive, once, signal, signalListener); + if (list.cow) { + list.cow = false; + list.listeners = [...list.listeners, listener]; + } + else { + list.listeners.push(listener); + } + return listener; +} +/** + * Remove a listener. + * @param list The listener list.
+ * @param callback The callback function to find. + * @param capture The capture flag to find. + * @returns `true` if it mutated the list directly. + */ +function removeListener(list, callback, capture) { + const index = findIndexOfListener(list, callback, capture); + if (index !== -1) { + return removeListenerAt(list, index); + } + return false; +} +/** + * Remove a listener. + * @param list The listener list. + * @param index The index of the target listener. + * @param disableCow Disable copy-on-write if true. + * @returns `true` if it mutated the `listeners` array directly. + */ +function removeListenerAt(list, index, disableCow = false) { + const listener = list.listeners[index]; + // Set the removed flag. + setRemoved(listener); + // Dispose the abort signal listener if exists. + if (listener.signal) { + listener.signal.removeEventListener("abort", listener.signalListener); + } + // Remove it from the array. + if (list.cow && !disableCow) { + list.cow = false; + list.listeners = list.listeners.filter((_, i) => i !== index); + return false; + } + list.listeners.splice(index, 1); + return true; +} + +/** + * Create a new `ListenerListMap` object. + */ +function createListenerListMap() { + return Object.create(null); +} +/** + * Get the listener list of the given type. + * If the listener list has not been initialized, initialize and return it. + * @param listenerMap The listener list map. + * @param type The event type to get. + */ +function ensureListenerList(listenerMap, type) { + var _a; + return ((_a = listenerMap[type]) !== null && _a !== void 0 ? _a : (listenerMap[type] = { + attrCallback: undefined, + attrListener: undefined, + cow: false, + listeners: [], + })); +} + +/** + * An implementation of the `EventTarget` interface. + * @see https://dom.spec.whatwg.org/#eventtarget + */ +class EventTarget { + /** + * Initialize this instance. 
+ */ + constructor() { + internalDataMap$2.set(this, createListenerListMap()); + } + // Implementation + addEventListener(type0, callback0, options0) { + const listenerMap = $$2(this); + const { callback, capture, once, passive, signal, type, } = normalizeAddOptions(type0, callback0, options0); + if (callback == null || (signal === null || signal === void 0 ? void 0 : signal.aborted)) { + return; + } + const list = ensureListenerList(listenerMap, type); + // Find existing listener. + const i = findIndexOfListener(list, callback, capture); + if (i !== -1) { + warnDuplicate(list.listeners[i], passive, once, signal); + return; + } + // Add the new listener. + addListener(list, callback, capture, passive, once, signal); + } + // Implementation + removeEventListener(type0, callback0, options0) { + const listenerMap = $$2(this); + const { callback, capture, type } = normalizeOptions(type0, callback0, options0); + const list = listenerMap[type]; + if (callback != null && list) { + removeListener(list, callback, capture); + } + } + // Implementation + dispatchEvent(e) { + const list = $$2(this)[String(e.type)]; + if (list == null) { + return true; + } + const event = e instanceof Event ? e : EventWrapper.wrap(e); + const eventData = $(event, "event"); + if (eventData.dispatchFlag) { + throw createInvalidStateError("This event has been in dispatching."); + } + eventData.dispatchFlag = true; + eventData.target = eventData.currentTarget = this; + if (!eventData.stopPropagationFlag) { + const { cow, listeners } = list; + // Set copy-on-write flag. + list.cow = true; + // Call listeners. + for (let i = 0; i < listeners.length; ++i) { + const listener = listeners[i]; + // Skip if removed. + if (isRemoved(listener)) { + continue; + } + // Remove this listener if it has the `once` flag. + if (isOnce(listener) && removeListenerAt(list, i, !cow)) { + // Because this listener was removed, the next index is the + // same as the current value.
+ i -= 1; + } + // Call this listener with the `passive` flag. + eventData.inPassiveListenerFlag = isPassive(listener); + invokeCallback(listener, this, event); + eventData.inPassiveListenerFlag = false; + // Stop if the `event.stopImmediatePropagation()` method was called. + if (eventData.stopImmediatePropagationFlag) { + break; + } + } + // Restore copy-on-write flag. + if (!cow) { + list.cow = false; + } + } + eventData.target = null; + eventData.currentTarget = null; + eventData.stopImmediatePropagationFlag = false; + eventData.stopPropagationFlag = false; + eventData.dispatchFlag = false; + return !eventData.canceledFlag; + } +} +/** + * Internal data. + */ +const internalDataMap$2 = new WeakMap(); +/** + * Get private data. + * @param target The event target object to get private data. + * @param name The variable name to report. + * @returns The private data of the event. + */ +function $$2(target, name = "this") { + const retv = internalDataMap$2.get(target); + assertType(retv != null, "'%s' must be an object that EventTarget constructor created, but got another one: %o", name, target); + return retv; +} +/** + * Normalize options. + * @param options The options to normalize. + */ +function normalizeAddOptions(type, callback, options) { + var _a; + assertCallback(callback); + if (typeof options === "object" && options !== null) { + return { + type: String(type), + callback: callback !== null && callback !== void 0 ? callback : undefined, + capture: Boolean(options.capture), + passive: Boolean(options.passive), + once: Boolean(options.once), + signal: (_a = options.signal) !== null && _a !== void 0 ? _a : undefined, + }; + } + return { + type: String(type), + callback: callback !== null && callback !== void 0 ? callback : undefined, + capture: Boolean(options), + passive: false, + once: false, + signal: undefined, + }; +} +/** + * Normalize options. + * @param options The options to normalize. 
+ */ +function normalizeOptions(type, callback, options) { + assertCallback(callback); + if (typeof options === "object" && options !== null) { + return { + type: String(type), + callback: callback !== null && callback !== void 0 ? callback : undefined, + capture: Boolean(options.capture), + }; + } + return { + type: String(type), + callback: callback !== null && callback !== void 0 ? callback : undefined, + capture: Boolean(options), + }; +} +/** + * Assert the type of the 'callback' argument. + * @param callback The callback to check. + */ +function assertCallback(callback) { + if (typeof callback === "function" || + (typeof callback === "object" && + callback !== null && + typeof callback.handleEvent === "function")) { + return; + } + if (callback == null || typeof callback === "object") { + InvalidEventListener.warn(callback); + return; + } + throw new TypeError(format(InvalidEventListener.message, [callback])); +} +/** + * Print a warning for a duplicated listener. + * @param listener The current listener that is duplicated. + * @param passive The passive flag of the new duplicated listener. + * @param once The once flag of the new duplicated listener. + * @param signal The signal object of the new duplicated listener. + */ +function warnDuplicate(listener, passive, once, signal) { + EventListenerWasDuplicated.warn(isCapture(listener) ? "capture" : "bubble", listener.callback); + if (isPassive(listener) !== passive) { + OptionWasIgnored.warn("passive"); + } + if (isOnce(listener) !== once) { + OptionWasIgnored.warn("once"); + } + if (listener.signal !== signal) { + OptionWasIgnored.warn("signal"); + } +} +// Set enumerable +const keys$1 = Object.getOwnPropertyNames(EventTarget.prototype); +for (let i = 0; i < keys$1.length; ++i) { + if (keys$1[i] === "constructor") { + continue; + } + Object.defineProperty(EventTarget.prototype, keys$1[i], { enumerable: true }); +} +// Ensure `eventTarget instanceof window.EventTarget` is `true`.
+if (typeof Global !== "undefined" && + typeof Global.EventTarget !== "undefined") { + Object.setPrototypeOf(EventTarget.prototype, Global.EventTarget.prototype); +} + +/** + * Get the current value of a given event attribute. + * @param target The `EventTarget` object to get. + * @param type The event type. + */ +function getEventAttributeValue(target, type) { + var _a, _b; + const listMap = $$2(target, "target"); + return (_b = (_a = listMap[type]) === null || _a === void 0 ? void 0 : _a.attrCallback) !== null && _b !== void 0 ? _b : null; +} +/** + * Set an event listener to a given event attribute. + * @param target The `EventTarget` object to set. + * @param type The event type. + * @param callback The event listener. + */ +function setEventAttributeValue(target, type, callback) { + if (callback != null && typeof callback !== "function") { + InvalidAttributeHandler.warn(callback); + } + if (typeof callback === "function" || + (typeof callback === "object" && callback !== null)) { + upsertEventAttributeListener(target, type, callback); + } + else { + removeEventAttributeListener(target, type); + } +} +//------------------------------------------------------------------------------ +// Helpers +//------------------------------------------------------------------------------ +/** + * Update or insert the given event attribute handler. + * @param target The `EventTarget` object to set. + * @param type The event type. + * @param callback The event listener. + */ +function upsertEventAttributeListener(target, type, callback) { + const list = ensureListenerList($$2(target, "target"), String(type)); + list.attrCallback = callback; + if (list.attrListener == null) { + list.attrListener = addListener(list, defineEventAttributeCallback(list), false, false, false, undefined); + } +} +/** + * Remove the given event attribute handler. + * @param target The `EventTarget` object to remove. + * @param type The event type. + * @param callback The event listener. 
+ */ +function removeEventAttributeListener(target, type) { + const listMap = $$2(target, "target"); + const list = listMap[String(type)]; + if (list && list.attrListener) { + removeListener(list, list.attrListener.callback, false); + list.attrCallback = list.attrListener = undefined; + } +} +/** + * Define the callback function for the given listener list object. + * It calls the `attrCallback` property if the property value is a function. + * @param list The `ListenerList` object. + */ +function defineEventAttributeCallback(list) { + return function (event) { + const callback = list.attrCallback; + if (typeof callback === "function") { + callback.call(this, event); + } + }; +} + +/** + * Define an `EventTarget` class that has event attributes. + * @param types The types to define event attributes. + * @deprecated Use the `getEventAttributeValue`/`setEventAttributeValue` pair on your derived class instead, as it is friendlier to static analysis. + */ +function defineCustomEventTarget(...types) { + class CustomEventTarget extends EventTarget { + } + for (let i = 0; i < types.length; ++i) { + defineEventAttribute(CustomEventTarget.prototype, types[i]); + } + return CustomEventTarget; +} +/** + * Define an event attribute. + * @param target The `EventTarget` object to define an event attribute. + * @param type The event type to define. + * @param _eventClass Unused, but present to infer the `Event` class type. + * @deprecated Use the `getEventAttributeValue`/`setEventAttributeValue` pair on your derived class instead, as it is friendlier to static analysis.
+ */ +function defineEventAttribute(target, type, _eventClass) { + Object.defineProperty(target, `on${type}`, { + get() { + return getEventAttributeValue(this, type); + }, + set(value) { + setEventAttributeValue(this, type, value); + }, + configurable: true, + enumerable: true, + }); +} + +exports.Event = Event; +exports.EventTarget = EventTarget; +exports.default = EventTarget; +exports.defineCustomEventTarget = defineCustomEventTarget; +exports.defineEventAttribute = defineEventAttribute; +exports.getEventAttributeValue = getEventAttributeValue; +exports.setErrorHandler = setErrorHandler; +exports.setEventAttributeValue = setEventAttributeValue; +exports.setWarningHandler = setWarningHandler; +//# sourceMappingURL=index.js.map diff --git a/tools/build-webrtc.py b/tools/build-webrtc.py deleted file mode 100644 index 92a162e87..000000000 --- a/tools/build-webrtc.py +++ /dev/null @@ -1,344 +0,0 @@ -from __future__ import print_function - -import argparse -import errno -import os -import shutil -import subprocess -import sys - - -# Constants - -APPLE_FRAMEWORK_NAME = 'WebRTC.framework' -APPLE_DSYM_NAME = 'WebRTC.dSYM' - -ANDROID_CPU_ABI_MAP = { - 'arm' : 'armeabi-v7a', - 'arm64' : 'arm64-v8a', - 'x86' : 'x86', - 'x64' : 'x86_64' -} -ANDROID_BUILD_CPUS = [ - 'arm', - 'arm64', - 'x86', - 'x64' -] -IOS_BUILD_ARCHS = [ - 'device:arm64', - 'simulator:arm64', - 'simulator:x64' -] -MACOS_BUILD_ARCHS = [ - 'x64' -] - -def build_gn_args(platform_args): - return "--args='" + ' '.join(GN_COMMON_ARGS + platform_args) + "'" - -GN_COMMON_ARGS = [ - 'rtc_libvpx_build_vp9=true', - 'rtc_enable_protobuf=false', - 'rtc_include_tests=false', - 'is_debug=%s', - 'target_cpu="%s"' -] - -_GN_APPLE_COMMON = [ - 'enable_dsyms=true', - 'enable_stripping=true', - 'rtc_enable_symbol_export=false', - 'rtc_enable_objc_symbol_export=true' -] - -_GN_IOS_ARGS = [ - 'ios_deployment_target="12.0"', - 'ios_enable_code_signing=false', - 'target_os="ios"', - 'target_environment="%s"' -] -GN_IOS_ARGS = 
build_gn_args(_GN_APPLE_COMMON + _GN_IOS_ARGS) - -_GN_MACOS_ARGS = [ - 'target_os="mac"' -] -GN_MACOS_ARGS = build_gn_args(_GN_APPLE_COMMON + _GN_MACOS_ARGS) - -_GN_ANDROID_ARGS = [ - 'target_os="android"' -] -GN_ANDROID_ARGS = build_gn_args(_GN_ANDROID_ARGS) - - -# Utilities - -def sh(cmd, env=None, cwd=None): - print('Running cmd: %s' % cmd) - try: - subprocess.check_call(cmd, env=env, cwd=cwd, shell=True, stdin=sys.stdin, stdout=sys.stdout, stderr=subprocess.STDOUT) - except subprocess.CalledProcessError as e: - sys.exit(e.returncode) - except KeyboardInterrupt: - pass - -def mkdirp(path): - try: - os.makedirs(path) - except OSError as e: - if e.errno != errno.EEXIST: - raise - -def rmr(path): - try: - shutil.rmtree(path) - except OSError as e: - if e.errno != errno.ENOENT: - raise - - -# The Real Deal - -def setup(target_dir, platform): - mkdirp(target_dir) - os.chdir(target_dir) - - # Maybe fetch depot_tools - depot_tools_dir = os.path.join(target_dir, 'depot_tools') - if not os.path.isdir(depot_tools_dir): - print('Fetching Chromium depot_tools...') - sh('git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git') - - # Prepare environment - env = os.environ.copy() - env['PATH'] = '%s:%s' % (env['PATH'], depot_tools_dir) - - # Maybe fetch WebRTC - webrtc_dir = os.path.join(target_dir, 'webrtc', platform) - if not os.path.isdir(webrtc_dir): - mkdirp(webrtc_dir) - os.chdir(webrtc_dir) - print('Fetching WebRTC for %s...' 
% platform) - sh('fetch --nohooks webrtc_%s' % platform, env) - - # Run gclient - sh('gclient sync', env) - - # Install dependencies - if platform == 'android': - webrtc_dir = os.path.join(target_dir, 'webrtc', platform, 'src') - os.chdir(webrtc_dir) - sh('./build/install-build-deps.sh') - - -def sync(target_dir, platform): - build_dir = os.path.join(target_dir, 'build', platform) - depot_tools_dir = os.path.join(target_dir, 'depot_tools') - webrtc_dir = os.path.join(target_dir, 'webrtc', platform, 'src') - - if not os.path.isdir(webrtc_dir): - print('WebRTC source not found, did you forget to run --setup?') - sys.exit(1) - - # Prepare environment - env = os.environ.copy() - path_parts = [env['PATH'], depot_tools_dir] - if platform == 'android': - # Same as . build/android/envsetup.sh - android_sdk_root = os.path.join(webrtc_dir, 'third_party/android_sdk/public') - path_parts.append(os.path.join(android_sdk_root, 'platform-tools')) - path_parts.append(os.path.join(android_sdk_root, 'tools')) - path_parts.append(os.path.join(webrtc_dir, 'build/android')) - env['PATH'] = ':'.join(path_parts) - - os.chdir(webrtc_dir) - - sh('gclient sync -D', env) - - -def build(target_dir, platform, debug): - build_dir = os.path.join(target_dir, 'build', platform) - build_type = 'Debug' if debug else 'Release' - depot_tools_dir = os.path.join(target_dir, 'depot_tools') - webrtc_dir = os.path.join(target_dir, 'webrtc', platform, 'src') - - if not os.path.isdir(webrtc_dir): - print('WebRTC source not found, did you forget to run --setup?') - sys.exit(1) - - # Prepare environment - env = os.environ.copy() - path_parts = [env['PATH'], depot_tools_dir] - if platform == 'android': - # Same as . 
build/android/envsetup.sh - android_sdk_root = os.path.join(webrtc_dir, 'third_party/android_sdk/public') - path_parts.append(os.path.join(android_sdk_root, 'platform-tools')) - path_parts.append(os.path.join(android_sdk_root, 'tools')) - path_parts.append(os.path.join(webrtc_dir, 'build/android')) - env['PATH'] = ':'.join(path_parts) - - os.chdir(webrtc_dir) - - # Cleanup old build - rmr('out') - - # Run GN - if platform == 'ios': - for item in IOS_BUILD_ARCHS: - tenv, arch = item.split(':') - gn_out_dir = 'out/%s-ios-%s-%s' % (build_type, tenv, arch) - gn_args = GN_IOS_ARGS % (str(debug).lower(), arch, tenv) - gn_cmd = 'gn gen %s %s' % (gn_out_dir, gn_args) - sh(gn_cmd, env) - #for arch in MACOS_BUILD_ARCHS: - # gn_out_dir = 'out/%s-macos-%s' % (build_type, arch) - # gn_args = GN_MACOS_ARGS % (str(debug).lower(), arch) - # gn_cmd = 'gn gen %s %s' % (gn_out_dir, gn_args) - # sh(gn_cmd, env) - else: - for cpu in ANDROID_BUILD_CPUS: - gn_out_dir = 'out/%s-%s' % (build_type, cpu) - gn_args = GN_ANDROID_ARGS % (str(debug).lower(), cpu) - gn_cmd = 'gn gen %s %s' % (gn_out_dir, gn_args) - sh(gn_cmd, env) - - # Build with Ninja - if platform == 'ios': - for item in IOS_BUILD_ARCHS: - tenv, arch = item.split(':') - gn_out_dir = 'out/%s-ios-%s-%s' % (build_type, tenv, arch) - ninja_cmd = 'ninja -C %s framework_objc' % gn_out_dir - sh(ninja_cmd, env) - #for arch in MACOS_BUILD_ARCHS: - # gn_out_dir = 'out/%s-macos-%s' % (build_type, arch) - # ninja_cmd = 'ninja -C %s mac_framework_objc' % gn_out_dir - # sh(ninja_cmd, env) - else: - for cpu in ANDROID_BUILD_CPUS: - gn_out_dir = 'out/%s-%s' % (build_type, cpu) - ninja_cmd = 'ninja -C %s libwebrtc libjingle_peerconnection_so' % gn_out_dir - sh(ninja_cmd, env) - - # Cleanup build dir - rmr(build_dir) - mkdirp(build_dir) - - # Copy build artifacts to build directory - if platform == 'ios': - # Fat simulators (we need a single slice for both simulators) - simulators = [item for item in IOS_BUILD_ARCHS if 
item.startswith('simulator')] - tenv, arch = simulators[0].split(':') - gn_out_dir = 'out/%s-ios-%s-%s' % (build_type, tenv, arch) - - shutil.copytree(os.path.join(gn_out_dir, APPLE_FRAMEWORK_NAME), os.path.join(gn_out_dir, 'fat-' + APPLE_FRAMEWORK_NAME)) - out_lib_path = os.path.join(gn_out_dir, 'fat-' + APPLE_FRAMEWORK_NAME, 'WebRTC') - slice_paths = [] - for item in simulators: - tenv, arch = item.split(':') - lib_path = os.path.join('out/%s-ios-%s-%s' % (build_type, tenv, arch), APPLE_FRAMEWORK_NAME, 'WebRTC') - slice_paths.append(lib_path) - sh('lipo %s -create -output %s' % (' '.join(slice_paths), out_lib_path)) - - orig_framework_path = os.path.join(gn_out_dir, APPLE_FRAMEWORK_NAME) - bak_framework_path = os.path.join(gn_out_dir, 'bak-' + APPLE_FRAMEWORK_NAME) - fat_framework_path = os.path.join(gn_out_dir, 'fat-' + APPLE_FRAMEWORK_NAME) - shutil.move(orig_framework_path, bak_framework_path) - shutil.move(fat_framework_path, orig_framework_path) - - # dSYMs - shutil.copytree(os.path.join(gn_out_dir, APPLE_DSYM_NAME), os.path.join(gn_out_dir, 'fat-' + APPLE_DSYM_NAME)) - out_dsym_path = os.path.join(gn_out_dir, 'fat-' + APPLE_DSYM_NAME, 'Contents', 'Resources', 'DWARF', 'WebRTC') - slice_paths = [] - for item in simulators: - tenv, arch = item.split(':') - dsym_path = os.path.join('out/%s-ios-%s-%s' % (build_type, tenv, arch), APPLE_DSYM_NAME, 'Contents', 'Resources', 'DWARF', 'WebRTC') - slice_paths.append(dsym_path) - sh('lipo %s -create -output %s' % (' '.join(slice_paths), out_dsym_path)) - - orig_dsym_path = os.path.join(gn_out_dir, APPLE_DSYM_NAME) - bak_dsym_path = os.path.join(gn_out_dir, 'bak-' + APPLE_DSYM_NAME) - fat_dsym_path = os.path.join(gn_out_dir, 'fat-' + APPLE_DSYM_NAME) - shutil.move(orig_dsym_path, bak_dsym_path) - shutil.move(fat_dsym_path, orig_dsym_path) - - _IOS_BUILD_ARCHS = [item for item in IOS_BUILD_ARCHS if not item.startswith('simulator')] - _IOS_BUILD_ARCHS.append(simulators[0]) - - # XCFramework - xcframework_path = 
os.path.join(build_dir, 'WebRTC.xcframework') - xcodebuild_cmd = 'xcodebuild -create-xcframework -output %s' % xcframework_path - for item in _IOS_BUILD_ARCHS: - tenv, arch = item.split(':') - gn_out_dir = 'out/%s-ios-%s-%s' % (build_type, tenv, arch) - xcodebuild_cmd += ' -framework %s' % os.path.abspath(os.path.join(gn_out_dir, APPLE_FRAMEWORK_NAME)) - xcodebuild_cmd += ' -debug-symbols %s' % os.path.abspath(os.path.join(gn_out_dir, APPLE_DSYM_NAME)) - #for arch in MACOS_BUILD_ARCHS: - # gn_out_dir = 'out/%s-macos-%s' % (build_type, arch) - # xcodebuild_cmd += ' -framework %s' % os.path.join(gn_out_dir, APPLE_FRAMEWORK_NAME) - sh(xcodebuild_cmd) - sh('zip -r WebRTC.xcframework.zip WebRTC.xcframework', cwd=build_dir) - rmr(xcframework_path) - else: - gn_out_dir = 'out/%s-%s' % (build_type, ANDROID_BUILD_CPUS[0]) - shutil.copy(os.path.join(gn_out_dir, 'lib.java/sdk/android/libwebrtc.jar'), build_dir) - - for cpu in ANDROID_BUILD_CPUS: - lib_dir = os.path.join(build_dir, 'lib', ANDROID_CPU_ABI_MAP[cpu]) - mkdirp(lib_dir) - gn_out_dir = 'out/%s-%s' % (build_type, cpu) - shutil.copy(os.path.join(gn_out_dir, 'libjingle_peerconnection_so.so'), lib_dir) - - sh('jar cvfM libjingle_peerconnection.so.jar lib', cwd=build_dir) - rmr(os.path.join(build_dir, 'lib')) - sh('zip -r android-webrtc.zip *.jar', cwd=build_dir) - - -if __name__ == "__main__": - parser = argparse.ArgumentParser() - parser.add_argument('dir', help='Target directory') - parser.add_argument('--setup', help='Prepare the target directory for building', action='store_true') - parser.add_argument('--build', help='Build WebRTC in the target directory', action='store_true') - parser.add_argument('--sync', help='Runs gclient sync on the WebRTC directory', action='store_true') - parser.add_argument('--ios', help='Use iOS as the target platform', action='store_true') - parser.add_argument('--android', help='Use Android as the target platform', action='store_true') - parser.add_argument('--debug', help='Make a Debug 
build (defaults to false)', action='store_true') - - args = parser.parse_args() - - if not (args.setup or args.build or args.sync): - print('--setup or --build must be specified!') - sys.exit(1) - - if args.setup and args.build: - print('--setup and --build cannot be specified at the same time!') - sys.exit(1) - - if not (args.ios or args.android): - print('--ios or --android must be specified!') - sys.exit(1) - - if args.ios and args.android: - print('--ios and --android cannot be specified at the same time!') - sys.exit(1) - - if not os.path.isdir(args.dir): - print('The specified directory does not exist!') - sys.exit(1) - - target_dir = os.path.abspath(os.path.join(args.dir, 'build_webrtc')) - platform = 'ios' if args.ios else 'android' - - if args.setup: - setup(target_dir, platform) - print('WebRTC setup for %s completed in %s' % (platform, target_dir)) - sys.exit(0) - - if args.sync: - sync(target_dir, platform) - print('WebRTC sync for %s completed in %s' % (platform, target_dir)) - sys.exit(0) - - if args.build: - build(target_dir, platform, args.debug) - print('WebRTC build for %s completed in %s' % (platform, target_dir)) - sys.exit(0)