| Summary: | Force GLMediaPlayer not to depend on machine time | | |
|---|---|---|---|
| Product: | [JogAmp] Jogl | Reporter: | Ertong <spam> |
| Component: | video | Assignee: | Sven Gothel <sgothel> |
| Status: | CONFIRMED | | |
| Severity: | enhancement | CC: | spam |
| Priority: | P4 | | |
| Version: | 3.0.0 | | |
| Hardware: | All | | |
| OS: | all | | |
| Type: | FEATURE | SCM Refs: | |
| Workaround: | --- | | |
Description
Ertong 2013-12-29 23:00:39 CET
I see you have reused the same description and I don't fully understand your use case. The player shall give you the real-time frame at the given time of calling getNextFrame(), i.e.:

    display: 100 fps
    video: 20 fps
    display call # 100 @ 1s -> video frame # 20
    display call # 200 @ 2s -> video frame # 40

Hence video frames will be redisplayed 'naturally' to keep the video in sync with real time. This use case is already working properly. Please elaborate on your use case.

+++

For editing, or for using a different time base, all we can do here IMHO is to not consider time at all in getNextFrame() and simply deliver the next consecutive frame. The time of this frame will be included in the TextureFrame itself, hence the caller can take care of any actions here. One problem with this approach may be audio, since audio playback will always block until buffers become available - if used. We may either need a different AudioSink (memory only) or disable it. The above will be mostly usable for editing IMHO, but should be feasible for other 'time base issues' as well?

+++

I'll try to describe the use case in different words.

I'm trying to render OpenGL frames to a different video. For example, I have a movie, play it on a plane, and save the result to a new video. I'm interested only in the resulting video; the rendering itself can even be done offscreen.

In this case:
Source video: 25 fps.
Rendering: 100 fps.
Destination video: 25 fps.

If you tie the movie to 100 fps, then while playing the destination video, the original one will be played 4 times faster.

----

>> is to not consider time at all @ getNextFrame()
>> and simply deliver the next consecutive frame.

As far as I understand, you mean getNextTexture(). Hm ... I didn't know that TextureSequence.TextureFrame has a timestamp. That will help. But it would be really useful to have a time-independent implementation of getNextTexture().

>> One problem with this approach may be audio

In this case it is not necessary to keep audio in sync while playing.
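To make the mismatch concrete, here is a small, self-contained Java sketch (hypothetical names, not the JOGL API) contrasting machine-time frame selection with the consecutive delivery requested above. The point is that under machine time the render-call-to-source-frame mapping depends on how fast the render loop happens to run, while consecutive delivery is stable:

```java
// Hypothetical sketch, not JOGL API: contrasts machine-time frame
// selection with plain consecutive delivery for the figures above
// (source 25 fps, render loop 100 fps).
public class FrameMapping {

    // Machine-time selection: render call n is assumed to occur at
    // n / renderFps seconds of wall time, and the player returns the
    // source frame covering that instant.
    static int timeBasedFrame(int renderCall, double renderFps, double srcFps) {
        double wallSeconds = renderCall / renderFps;
        return (int) Math.floor(wallSeconds * srcFps);
    }

    // Time-independent mode requested in this report: render call n
    // simply receives source frame n, no matter how fast the loop runs.
    static int consecutiveFrame(int renderCall) {
        return renderCall;
    }

    public static void main(String[] args) {
        double srcFps = 25.0, renderFps = 100.0;
        // With machine time, each source frame is handed out 4 times
        // (100 / 25), and the mapping shifts whenever the loop speed changes:
        System.out.println(timeBasedFrame(0, renderFps, srcFps));   // 0
        System.out.println(timeBasedFrame(4, renderFps, srcFps));   // 1
        System.out.println(timeBasedFrame(100, renderFps, srcFps)); // 25
        // Consecutive delivery is independent of loop speed:
        System.out.println(consecutiveFrame(100));                  // 100
    }
}
```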
At this stage, I'm trying to manage the audio afterwards, so audio will be turned off while rendering. But this is only my case; maybe somebody will have different needs. For example, it could be useful to enumerate audio frames along with the video ones.

(In reply to comment #2)
> I'll try to describe the use case in different words.
>
> I'm trying to render OpenGL frames to a different video.
> For example, I have a movie, play it on a plane, and save the result to
> a new video. I'm interested only in the resulting video. The
> rendering itself can even be done offscreen.
>
> In this case:
> Source video: 25 fps.
> Rendering: 100 fps.
> Destination video: 25 fps.
>
> If you tie the movie to 100 fps, then while playing the destination video,
> the original one will be played 4 times faster.

The current player would render 25 frames per second if GL rendering is above that, i.e. 100 fps. So you are saying 'you want to play it 4 times faster, i.e. at maximum renderable speed'? (As described in my former reply .. 'video editing'.) Sorry for my lack of understanding of your case here.

> ----
> >> is to not consider time at all @ getNextFrame()
> >> and simply deliver the next consecutive frame.
> As far as I understand, you mean getNextTexture().
> Hm ... I didn't know that TextureSequence.TextureFrame has a timestamp.
> That will help.
> But it would be really useful to have a time-independent implementation of
> getNextTexture().

Surely we could simply turn off time-based a/v sync; then 'getNextTexture()' would deliver the next frame, blocking until it becomes available. We may turn off the StreamWorker thread here. There will always be a 'time base' in motion pictures, i.e. at least the sequence between I-frames, of course. Hence a method like 'getNext(long time)' will not be too useful, as your experimentation with seek turned out ..

> >> One problem with this approach may be audio
> In this case it is not necessary to keep audio in sync while playing.
> At this stage, I'm trying to manage the audio afterwards,
> so audio will be turned off while rendering.
> But this is only my case; maybe somebody will have different needs. For
> example, it could be useful to enumerate audio frames along with the video ones.

(In reply to comment #3)
>> Surely we could simply turn off time-based a/v sync;
>> then 'getNextTexture()' would deliver the next frame,
>> blocking until it becomes available. We may turn
>> off the StreamWorker thread here.
This ability will be enough for my case.
Thanks.
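The "turn off time-based a/v sync and block until the next frame is available" behaviour discussed above could look roughly like the following minimal mock. 'Frame', 'decodeLoop' and 'nextFrameBlocking' are hypothetical names, not the actual StreamWorker/TextureSequence code; the real player would hand out TextureSequence.TextureFrame objects whose timestamp the caller inspects:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal mock of time-independent, blocking frame delivery: the
// decoder is paced by the consumer pulling frames, not by machine time.
public class BlockingDelivery {
    record Frame(int index, int ptsMillis) {}

    private final BlockingQueue<Frame> decoded = new ArrayBlockingQueue<>(4);

    // Decoder side: push consecutive frames with their embedded pts;
    // blocks when the small queue is full.
    void decodeLoop(int frameCount, int srcFps) throws InterruptedException {
        for (int i = 0; i < frameCount; i++) {
            decoded.put(new Frame(i, i * 1000 / srcFps));
        }
    }

    // Consumer side: blocks until the next consecutive frame is available.
    Frame nextFrameBlocking() throws InterruptedException {
        return decoded.take();
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingDelivery player = new BlockingDelivery();
        Thread decoder = new Thread(() -> {
            try { player.decodeLoop(10, 25); } catch (InterruptedException ignored) {}
        });
        decoder.start();
        for (int i = 0; i < 10; i++) {
            Frame f = player.nextFrameBlocking();
            System.out.println("frame " + f.index() + " pts=" + f.ptsMillis() + "ms");
        }
        decoder.join();
    }
}
```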
And in addition to that ... can you implement a blocking version of seek()?
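A blocking seek() could, in principle, be layered on top of consecutive frame delivery by pulling frames until the embedded presentation timestamp reaches the target. Again a hypothetical sketch with made-up names, not the actual GLMediaPlayer API:

```java
import java.util.Iterator;
import java.util.stream.IntStream;

// Hypothetical sketch of a blocking seek on top of consecutive frame
// delivery: keep pulling frames until the presentation timestamp (pts)
// reaches the requested target. Not the actual GLMediaPlayer API.
public class BlockingSeek {
    record Frame(int index, int ptsMillis) {}

    // "Blocks" (here: iterates) until a frame with pts >= targetMillis
    // appears and returns it; returns null if the stream ends first.
    static Frame seekBlocking(Iterator<Frame> frames, int targetMillis) {
        while (frames.hasNext()) {
            Frame f = frames.next();
            if (f.ptsMillis() >= targetMillis) {
                return f;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        // Mock 25 fps stream: one frame every 40 ms.
        Iterator<Frame> stream =
            IntStream.range(0, 250).mapToObj(i -> new Frame(i, i * 40)).iterator();
        Frame f = seekBlocking(stream, 1000); // seek to t = 1 s
        System.out.println("frame " + f.index() + " pts=" + f.ptsMillis() + "ms");
        // -> frame 25 pts=1000ms
    }
}
```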