Video processing in applications is mainstream today, and we as developers should keep up with it. At first sight, it may seem like a pretty difficult topic that requires special knowledge and education. Luckily for us, some frameworks aim to simplify this process and provide a declarative way to express how we want a video to be processed.

One of those frameworks is called FFmpeg. It provides a Command Line Interface (CLI) and looks like it can do everything that is needed. We as engineers just need to review the documentation and compose the right command.

In this article, we are going to implement a specific project and gain enough knowledge to use FFmpeg in your personal projects afterward. The whole code is available in the GitHub repository, which you may find here. The README file contains instructions on how to run and use the server. The project is tested with FFmpeg version 4.4.1 and NodeJS version 16.15.0.

Before we start, I strongly recommend reading this article, as it provides a general overview of the FFmpeg syntax and explains how filters work, since filters are the heart of this framework and we are going to use them extensively.

Pay attention that FFmpeg is licensed under the GNU Lesser General Public License (LGPL) version 2.1 or later. Before using FFmpeg components in any commercial project, you should consult your legal team and study the legal section of the FFmpeg website.

Well, the original task is as follows: a user records their answers to some interview questions, and the frontend (FE) sends these video chunks to the server with metadata associated with each of them (in our case, the question itself and the user's first and last name). Here I skipped the implementation of the FE part and put the video recorded from the camera directly into the server directory /src/media/camera. It includes two recorded MP4 videos with a resolution of 1280x720 (they may be delivered in WebM format too, but it doesn't matter, since FFmpeg can easily convert video from one format to another).

We need to create one video composition from all recorded chunks and put some covers before each chunk (picture 1):

- cover with the title - where we need to incorporate the user's first and last name
- cover with a question - placed before each recorded chunk, where we need to incorporate the question
- recorded chunk - where we need to put a logo watermark inside and adjust the video to fit the expected 9:16 resolution
- end cover - this chunk will be prepared by a motion designer, and we don't need to make any modifications, just concatenate it to the final video

There are a lot of ways to use this framework. Let's have a look at some of them that might be interesting.

1.1. Run FFmpeg as a command in the terminal of the local machine

First, you need to install FFmpeg on your local machine (to find out more, see this link). After installation, make sure that FFmpeg is available:

> ffmpeg -version

If everything is OK, you may create specific commands. For example, this one concatenates two videos, in1.avi and in2.avi, into one video with the name out.avi:

> ffmpeg -i in1.avi -i in2.avi -filter_complex concat out.avi

1.2. Run FFmpeg as a command on the NodeJS server

Let's assume that our server is supposed to be spun up in a Docker container.
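For the Docker part, a container image for such a server only needs Node and an FFmpeg binary. The following Dockerfile is a sketch under assumptions of mine (a Debian-based official Node image and a `src/index.js` entry point), not the article repository's actual setup:

```dockerfile
# Assumption: Debian-based official Node image matching the tested NodeJS version
FROM node:16.15.0

# Install FFmpeg from the distribution's package repository
RUN apt-get update \
 && apt-get install -y --no-install-recommends ffmpeg \
 && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .

# Entry point is hypothetical; adjust to the project's actual start script
CMD ["node", "src/index.js"]
```

Installing FFmpeg inside the image keeps the server self-contained, so the same `child_process` call works identically on a laptop and in the container.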