I found some examples of generating video with Python by piping frames to external programs like ffmpeg or mencoder. All of those snippets encode each frame to an image format (png, jpg, ...) using PIL or matplotlib. Besides the extra dependency, building figures or encoding images can be quite slow. I finally found a way to pipe raw numpy buffers directly to mencoder as frames. You can use the following code:
import subprocess
import numpy as np

class VideoSink(object):
    """Pipe raw numpy frames to mencoder, which encodes them into an .avi file."""

    def __init__(self, size, filename="output", rate=10, byteorder="bgra"):
        self.size = size  # (height, width) of the incoming frames
        cmdstring = ('mencoder',
                     '/dev/stdin',
                     '-demuxer', 'rawvideo',
                     '-rawvideo', 'w=%i:h=%i' % size[::-1] + ':fps=%i:format=%s' % (rate, byteorder),
                     '-o', filename + '.avi',
                     '-ovc', 'lavc',
                     )
        self.p = subprocess.Popen(cmdstring, stdin=subprocess.PIPE, shell=False)

    def run(self, image):
        # check only the spatial dimensions, so both packed (h, w) and (h, w, 4) frames work
        assert image.shape[:2] == self.size
        # image.tofile(self.p.stdin)  # should be faster, but turned out slower in practice
        self.p.stdin.write(image.tobytes())  # tobytes() is the modern name for tostring()

    def close(self):
        self.p.stdin.close()
        self.p.wait()  # let mencoder finish writing the file
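
For completeness, a minimal usage sketch follows. The frame size, rate, and synthetic gradient are only illustrative assumptions, not the original usage example; any numpy array with four bytes per pixel will do for the "bgra" format.

h, w = 480, 640
sink = VideoSink((h, w), filename="demo", rate=25, byteorder="bgra")
for t in range(100):
    # build a synthetic (h, w, 4) uint8 frame: four bytes per pixel to match "bgra"
    frame = np.zeros((h, w, 4), dtype=np.uint8)
    frame[..., 0] = t % 256   # blue channel ramps over time
    frame[..., 3] = 255       # opaque alpha
    sink.run(frame)
sink.close()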
It is roughly ten times faster than the PIL-based version, and there is no fair comparison with the matplotlib-based one. I got it working with mencoder, but I could not figure out how to make it work with ffmpeg.
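
If someone wants to try the ffmpeg route anyway, here is an untested sketch: the flag names are standard ffmpeg rawvideo input options, but I have not verified this pipeline. The idea would be to replace the mencoder command tuple in __init__ with something like:

cmdstring = ('ffmpeg',
             '-f', 'rawvideo',            # raw, headerless frames on stdin
             '-pix_fmt', byteorder,       # e.g. "bgra"
             '-s', '%ix%i' % size[::-1],  # frame size as widthxheight
             '-r', '%i' % rate,           # input frame rate
             '-i', '-',                   # read the frames from stdin
             '-vcodec', 'mpeg4',          # or any other encoder ffmpeg supports
             filename + '.avi')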
You can find that code, together with a usage example, there. Since it is based on other public domain snippets, consider it public domain as well.
I am using that class to save the output of my Freenect-based project. More on that in upcoming entries.