I've been working on a simple piece of screen recording software, and in this post I share how I finally figured out how to record both microphone and desktop audio simultaneously. Previously, I could only record one or the other. The key is to use the Web Audio API, specifically createMediaStreamSource and createMediaStreamDestination, to combine the two audio streams into one. This combined stream can then be fed into the MediaRecorder API. You can check out the full code on my Glitch project and see a demo, too!
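Roughly, the wiring looks like this. This is a minimal sketch, not the Glitch project's code: `desktopStream` is assumed to be a screen capture with audio enabled and `micStream` a getUserMedia microphone stream, and both names are illustrative.

```js
// Sketch: merge desktop and microphone audio into a single stream.
// `desktopStream` and `micStream` are assumed inputs, not from the original project.
const audioCtx = new AudioContext();
const destination = audioCtx.createMediaStreamDestination();

// Route both audio sources into the shared destination node.
audioCtx.createMediaStreamSource(desktopStream).connect(destination);
audioCtx.createMediaStreamSource(micStream).connect(destination);

// Combine the desktop video track with the merged audio track and record it.
const combined = new MediaStream([
  ...desktopStream.getVideoTracks(),
  ...destination.stream.getAudioTracks(),
]);

const recorder = new MediaRecorder(combined, { mimeType: 'video/webm' });
recorder.start();
```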
I've encountered a bug in Chrome on Android where MediaRecorder, fed from Canvas.captureStream(), fails to encode video from large canvas elements (e.g., 1280x720). The same process works in desktop Chrome, but on Android the recording stops abruptly at unpredictable points, likely due to limitations in the MediaRecorder API's encoding capabilities. A smaller canvas (640x480) works fine, which suggests the limit is resolution-based. I've reported this as Chrome bug 897727 and created a demo to illustrate the issue.
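For context, a reproduction of this shape is what triggers it for me. This is a hedged sketch rather than my actual demo; the canvas sizes, frame rate, and drawing loop are illustrative.

```js
// Sketch of a repro: record a large canvas and watch the recording stop early
// on Android Chrome. Sizes and timings are illustrative.
const canvas = document.createElement('canvas');
canvas.width = 1280;  // fails on Android in my testing
canvas.height = 720;  // a 640x480 canvas records fine
const ctx = canvas.getContext('2d');

// Keep drawing so captureStream() has frames to deliver.
function draw() {
  ctx.fillStyle = `hsl(${Date.now() % 360}, 80%, 50%)`;
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  requestAnimationFrame(draw);
}
draw();

const stream = canvas.captureStream(30); // 30 fps
const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
recorder.ondataavailable = (e) => console.log('chunk', e.data.size);
recorder.onstop = () => console.log('recorder stopped'); // fires unexpectedly early on Android
recorder.start(1000); // emit a chunk every second
```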
In this post, I'm sharing a screencast demonstrating how I built a web-based webcam and screen recorder using the navigator.getDisplayMedia API, which lets users grant access to their screen content for recording. The code captures both screen and audio, combines them into a single stream, and lets you download the recorded video as a webm file. This is a very early stage and the current output is raw. The ultimate goal is to build a full-fledged video editor in the browser, but for now, this screencast shows the initial steps in capturing video and audio.
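The flow boils down to something like the sketch below. It is not the screencast's exact code: it uses the API at its current standard location, navigator.mediaDevices.getDisplayMedia, and the function name, recorder options, and download filename are assumptions.

```js
// Sketch: capture the screen (with audio where supported), record it,
// and offer the result as a webm download.
async function recordScreen() {
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: true,
    audio: true, // tab/system audio support varies by browser and platform
  });

  const chunks = [];
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    // Wrap the recorded chunks in a Blob and trigger a download.
    const blob = new Blob(chunks, { type: 'video/webm' });
    const a = document.createElement('a');
    a.href = URL.createObjectURL(blob);
    a.download = 'recording.webm';
    a.click();
  };

  recorder.start();
  // Stop recording when the user ends screen sharing.
  stream.getVideoTracks()[0].addEventListener('ended', () => recorder.stop());
}
```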
While building a web-based video editor, I encountered an issue with handling multiple video tracks in a MediaStream. I wanted to switch between different video sources (desktop and webcam) on a single video element without interrupting the MediaRecorder. Attempting to do this by toggling the 'selected' property on the video element's videoTracks list failed: the list only ever contains the first video track added to the MediaStream, even though the stream itself contains both tracks. This prevents seamless switching between sources within the video element.
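A sketch of the attempt, to make the failure concrete. The stream and variable names here are illustrative, not my actual editor code, and note that HTMLMediaElement.videoTracks is still behind a flag in some browsers.

```js
// Sketch: add two video tracks to one stream, then try to switch between
// them via the element's videoTracks list. `desktopStream` and `webcamStream`
// are assumed inputs.
const combined = new MediaStream([
  desktopStream.getVideoTracks()[0],
  webcamStream.getVideoTracks()[0],
]);

console.log(combined.getVideoTracks().length); // 2 — the stream holds both tracks

const video = document.querySelector('video');
video.srcObject = combined;
video.addEventListener('loadedmetadata', () => {
  console.log(video.videoTracks.length); // 1 — only the first track is exposed
  // The intended switch, which there is no second entry to apply it to:
  // video.videoTracks[1].selected = true;
});
```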
I'm embarking on a project to build a web-based video editor! The goal is to create a tool that simplifies video creation and editing entirely within the browser. Think ScreenFlow, but accessible to everyone directly on the web. This project is driven by my own needs for creating device demos, screencasts, and other videos. I've already made some progress (check out the demo!), but there's a lot more to do. I'll be exploring existing web technologies to record audio/video, manipulate content (watermarks, filters, overlays), and output in various formats. This isn't about building a massive commercial product, but rather about understanding what's possible and empowering others to create great videos using the open web.