digitalmars.D.learn - better video rendering in d
- monkyyy (9/9) Mar 21 2023 My current method of making videos of using raylib to generate
- H. S. Teoh (15/18) Mar 21 2023 [...]
- monkyyy (9/17) Mar 21 2023 I vaguely remember an hour and half for 5 minutes of video when
- H. S. Teoh (9/29) Mar 21 2023 You could try to feed the frames to ffmpeg over stdin instead of storing
- Ferhat Kurtulmuş (4/24) Mar 21 2023 This is how I use pipe process with d and ffmpeg. I am reading
- Guillaume Piolat (9/18) Mar 24 2023 Hi,
- Guillaume Piolat (4/12) Mar 24 2023 Fixed the URLs.
- Ogi (3/6) Mar 25 2023 Why not use ffmpeg as a library? Here are the
My current method of making videos is using raylib to generate screenshots, throwing those screenshots into a folder, and calling a magic ffmpeg command, and it is ... slow. Does anyone have a demo or a project that does something smarter (or is willing to do the busy work of finding the right combo of dependencies that just work)? I require basic images, text, and transparent rectangles: https://youtu.be/HxFSmDNvDUI Ideally raylib or ImageMagick for the frame generation.
Mar 21 2023
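For context, the "magic ffmpeg command" in this kind of screenshot-folder workflow is typically something like the following. The exact framerate, frame-name pattern, and codec flags here are illustrative assumptions, not the poster's actual command:

```shell
# Assemble a numbered PNG sequence into an H.264 MP4.
# -framerate sets the input rate; -pix_fmt yuv420p keeps most players happy.
ffmpeg -framerate 30 -i frames/frame_%04d.png \
       -c:v libx264 -pix_fmt yuv420p out.mp4
```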
On Tue, Mar 21, 2023 at 04:57:49PM +0000, monkyyy via Digitalmars-d-learn wrote:
> My current method of making videos is using raylib to generate
> screenshots, throwing those screenshots into a folder, and calling a
> magic ffmpeg command, and it is ... slow.
[...]

How slow is it now, and how fast do you want it to be?

One possibility is to generate frames in parallel... though if you're recording a video of a sequence of operations, each of which depends on the previous, it may not be possible to parallelize.

I have a toy project that generates animations of a 3D model parametrized over time. It generates .pov files and runs POVRay to generate frames, then calls ffmpeg to make the video. This is parallelized with std.parallelism.parallel and is reasonably fast. However, ffmpeg will take a long time no matter what (encoding a video is a non-trivial operation).

T

--
Try to keep an open mind, but not so open your brain falls out. -- theboz
Mar 21 2023
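The render-in-parallel, encode-afterwards pipeline described above can be sketched in D roughly as follows. `writeFrame` is a hypothetical placeholder for whatever renders one frame; the `std.parallelism` and `std.process` plumbing is the point, and the ffmpeg flags are assumptions:

```d
// Sketch: frames depend only on their index, so rendering parallelizes;
// encoding is left to a single ffmpeg invocation at the end.
import std.file : mkdirRecurse;
import std.format : format;
import std.parallelism : parallel;
import std.process : execute;
import std.range : iota;

void writeFrame(int i, string path)
{
    // Hypothetical: render frame i and save it to `path`.
}

void main()
{
    enum nFrames = 300;
    mkdirRecurse("frames");

    // std.parallelism.parallel spreads iterations across worker threads.
    foreach (i; parallel(iota(nFrames)))
        writeFrame(i, format("frames/frame_%04d.png", i));

    // Encoding remains sequential; ffmpeg does the heavy lifting.
    execute(["ffmpeg", "-y", "-framerate", "30",
             "-i", "frames/frame_%04d.png",
             "-c:v", "libx264", "-pix_fmt", "yuv420p", "out.mp4"]);
}
```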
On Tuesday, 21 March 2023 at 17:18:15 UTC, H. S. Teoh wrote:
> On Tue, Mar 21, 2023 at 04:57:49PM +0000, monkyyy via Digitalmars-d-learn wrote:
>> My current method of making videos is using raylib to generate
>> screenshots, throwing those screenshots into a folder, and calling a
>> magic ffmpeg command, and it is ... slow.
> [...]
> How slow is it now, and how fast do you want it to be?

I vaguely remember an hour and a half for 5 minutes of video, when it's extremely lightweight and raylib trivially displays it in real time normally; realistically I wouldn't be surprised if it could do 1000 frames a second. Copying several GB of data to disk (probably asking the GPU for one pixel at a time) to be compressed down into a dozen MB of video is just... temp shit. I should just do something that isn't stressing hard drives so extremely unnecessarily.
Mar 21 2023
On Tue, Mar 21, 2023 at 05:29:22PM +0000, monkyyy via Digitalmars-d-learn wrote:
> On Tuesday, 21 March 2023 at 17:18:15 UTC, H. S. Teoh wrote:
>> How slow is it now, and how fast do you want it to be?
>
> I vaguely remember an hour and a half for 5 minutes of video [...]
> Copying several GB of data to disk (probably asking the GPU for one
> pixel at a time) to be compressed down into a dozen MB of video is
> just... temp shit. I should just do something that isn't stressing
> hard drives so extremely unnecessarily.

You could try to feed the frames to ffmpeg over stdin instead of storing the frames on disk. See this, for example:

https://stackoverflow.com/questions/45899585/pipe-input-in-to-ffmpeg-stdin

Then you can just feed live data to it in the background while you generate frames in the foreground.

T

--
Lottery: tax on the stupid. -- Slashdotter
Mar 21 2023
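The stdin-pipe suggestion above can be sketched in D with `std.process.pipeProcess`: raw RGBA frames go straight to ffmpeg, no intermediate files. `fillFrame` is a hypothetical placeholder for the raylib-side frame generation, and the ffmpeg flags are assumptions:

```d
// Sketch: stream raw frames to ffmpeg's stdin instead of writing PNGs.
import std.process : pipeProcess, Redirect, wait;

void fillFrame(ubyte[] buf, int i)
{
    // Hypothetical: fill `buf` with RGBA pixels for frame i.
}

void main()
{
    enum width = 1280, height = 720, nFrames = 300;

    // -f rawvideo tells ffmpeg the stdin stream is headerless pixels,
    // so -pix_fmt/-s/-r must describe the frames we send.
    auto pipes = pipeProcess([
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "rgba",
        "-s", "1280x720", "-r", "30",
        "-i", "-",                    // read frames from stdin
        "-c:v", "libx264", "-pix_fmt", "yuv420p", "out.mp4"
    ], Redirect.stdin);

    auto frame = new ubyte[width * height * 4];
    foreach (i; 0 .. nFrames)
    {
        fillFrame(frame, i);
        pipes.stdin.rawWrite(frame);  // blocks while ffmpeg encodes
    }
    pipes.stdin.close();              // EOF lets ffmpeg finalize the file
    wait(pipes.pid);
}
```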
On Tuesday, 21 March 2023 at 17:46:00 UTC, H. S. Teoh wrote:
> You could try to feed the frames to ffmpeg over stdin instead of
> storing the frames on disk. See this, for example:
>
> https://stackoverflow.com/questions/45899585/pipe-input-in-to-ffmpeg-stdin
>
> Then you can just feed live data to it in the background while you
> generate frames in the foreground.

This is how I use a pipe process with D and ffmpeg. I am reading video frames, but the other direction should work too:

https://github.com/aferust/oclcv/blob/main/examples/rgb2gray-video/source/app.d
Mar 21 2023
On Tuesday, 21 March 2023 at 16:57:49 UTC, monkyyy wrote:
> My current method of making videos is using raylib to generate
> screenshots, throwing those screenshots into a folder, and calling a
> magic ffmpeg command, and it is ... slow. [...]
> I require basic images, text, and transparent rectangles
> https://youtu.be/HxFSmDNvDUI
> Ideally raylib or ImageMagick for the frame generation.

Hi,

The idea to pipe stdout to ffmpeg is sound. In the following dead repo:

https://github.com/p0nce/y4m-tools

you will find a tool that captures a shader, formats it into Y4M, and outputs it on stdout. Y4M output is useful because it embeds the metadata, unlike .yuv.

See: https://github.com/p0nce/y4m-tools/blob/master/shader-capture/example.sh
Mar 24 2023
On Friday, 24 March 2023 at 15:41:36 UTC, Guillaume Piolat wrote:
> The idea to pipe stdout to ffmpeg is sound. [...] Y4M output is
> useful because it embeds the metadata, unlike .yuv.

Fixed the URLs.

How to output .y4m => https://github.com/p0nce/y4m-d/blob/master/source/y4md/package.d
Mar 24 2023
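The Y4M stream that y4m-d produces is simple enough to sketch by hand: one global header line carrying the metadata, then `FRAME\n` before each raw planar YUV 4:2:0 frame. The sketch below writes dummy grey frames to stdout under those assumptions, so it can be piped into ffmpeg:

```d
// Sketch: emit a YUV4MPEG2 stream on stdout.
// Usage: ./writer | ffmpeg -i - out.mp4
import std.format : format;
import std.stdio : stdout;

void main()
{
    enum w = 640, h = 480, fps = 30, nFrames = 60;

    // Stream header: width, height, framerate, progressive,
    // 1:1 pixel aspect, 4:2:0 chroma. This is the metadata that
    // a bare .yuv file lacks.
    stdout.rawWrite(format("YUV4MPEG2 W%d H%d F%d:1 Ip A1:1 C420\n",
                           w, h, fps));

    // 4:2:0 planar layout: full-res Y plane, quarter-res U and V planes.
    auto frame = new ubyte[w * h + 2 * (w / 2) * (h / 2)];
    foreach (i; 0 .. nFrames)
    {
        frame[] = cast(ubyte)(64 + i * 2); // dummy brightening grey ramp
        stdout.rawWrite("FRAME\n");
        stdout.rawWrite(frame);
    }
}
```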
On Tuesday, 21 March 2023 at 16:57:49 UTC, monkyyy wrote:
> My current method of making videos is using raylib to generate
> screenshots, throwing those screenshots into a folder, and calling a
> magic ffmpeg command, and it is ... slow.

Why not use ffmpeg as a library? Here are the [bindings](https://code.dlang.org/packages/ffmpeg-d).
Mar 25 2023