streamaserver / streama

Self-hosted streaming media server. https://docs.streama-project.com/

Transcoding and DASH support

nherbaut opened this issue

I noticed that streama doesn't support file transcoding.
Is it on your backlog? If not, can you elaborate on how to integrate it from a technical point of view?
I have a project that does transcoding and generates DASH segments that work well with dash.js; maybe I can work on a way to integrate it.

That would be absolutely fantastic! I was thinking of implementing a Node.js runner that uses idle system resources to convert files in the background and make them available to streama when ready; otherwise they are queued. But immediate transcoding would be even better. It's just not something that I know very much about.

On-the-fly transcoding is something I've been thinking about for a while, but have been working on other issues.

My plan was to create a wrapper around FFMPEG to handle the transcoding. When the video player requests the file, return the transcode.
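
As a very rough sketch of what such a wrapper could spawn (the input path, codecs and settings here are hypothetical, nothing streama does today), ffmpeg can transcode on the fly and write a streamable fragmented MP4 to stdout, which the wrapper would then pipe into the HTTP response:

# hypothetical on-the-fly transcode to stdout (fragmented MP4)
ffmpeg -i /path/to/input.mkv \
  -c:v libx264 -preset veryfast -crf 23 \
  -c:a aac -b:a 160k \
  -movflags frag_keyframe+empty_moov -f mp4 pipe:1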

@Jeronimo95
It depends on what the system requirements for the project are. It might be resource-hungry. For example, I've deployed Ampache, which performs simple audio transcoding with ffmpeg, and it already brings my VPS to its knees... Without proper GPU acceleration, it may not even be feasible to transcode one stream in real time. Amazon does offer GPU-accelerated instances that could be used, but the hosting price may not be what the end user expects, especially if the GPU instance is idle most of the time.

@dularion what are the possibilities for technical integration?
For example, I have a PoC (https://github.com/nherbaut/vhg-adaptation-worker) that runs async transcoding and DASH chunking tasks, using a transcoding workflow implemented in Python and working with an AMQP message broker to accommodate several parallel workers for scalability.
I could wrap it up in a stand-alone Docker microservice with a REST API and have streama trigger the transcoding depending on the current load of the system. We also have Java connectors that can drive the message broker (we use Spring, by the way).

Another option would be 100% pure Java code, relying on ffmpeg or GStreamer under the hood for transcoding.

Let me know what would work best
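
(For reference, a reasonably recent ffmpeg can also produce the DASH segments and MPD manifest on its own, without an external workflow; a minimal sketch with hypothetical file names:)

# hypothetical single-shot DASH packaging; assumes a recent ffmpeg build
ffmpeg -i input.mkv \
  -map 0:v -map 0:a -c:v libx264 -c:a aac \
  -f dash -seg_duration 4 manifest.mpd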

We could build the transcoder to do both pre-transcoding and live-transcoding. Then make it selectable for the user, by default live-transcoding the files so users don't have to think about it on a new install. We could have options like:

  • No Transcoding
  • Live Transcoding
    • Transcode Quality
    • Max Simultaneous Streams
  • Transcode on upload
    • Delete files after transcode
    • Transcode previously uploaded files
  • Transcode local files folder
    • Output folder (Same location / Different location)

Then the user would have the power to select what they wanted. For something like a small VPS - use the "Transcode on upload" or "No Transcoding". For a home server with a powerful CPU use "Live Transcoding".
The non-live options could be given a lower priority (or have selectable priority) so as to not kill other applications on the system (including streama).
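
On Linux, "lower priority" for the background transcodes could be as simple as launching ffmpeg under nice/ionice; a minimal sketch, with hypothetical file names and settings:

# run the pre-transcode at the lowest CPU priority and idle I/O class
nice -n 19 ionice -c 3 ffmpeg -i input.mkv \
  -c:v libx264 -preset veryfast -crf 23 -c:a aac \
  output.mp4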

If you take a look at the processes while a Plex transcode is running, it has the following options, which look similar to FFMPEG's; I wouldn't be surprised if they were using it as a base.

/usr/lib/plexmediaserver/Plex Transcoder
-codec:0 h264 -codec:1 ac3 -noaccurate_seek -i
/mnt/nas01/media/TV/Rick and Morty/Season 03/Rick and Morty S03E09 The ABC's of Beth.mkv
-map 0:0 -codec:0 copy -map 0:1 -codec:1 aac -ar:1 48000 -channel_layout:1 stereo -b:1 256k -f segment
-segment_format matroska -segment_format_options live=1 -segment_time 1
-segment_header_filename header -segment_start_number 0
-segment_list http://127.0.0.1:32400/video/:/transcode/session/ipq0ylx83ajhun6gkwiuaqsq/75278953-211e-4050-8834-dea2dde5b653/seglist
-segment_list_type csv -segment_list_size 2147483647 -avoid_negative_ts disabled -map_metadata -1
-map_chapters -1 chunk-%05d -start_at_zero -copyts -y -nostats -loglevel quiet -loglevel_plex error
-progressurl http://127.0.0.1:32400/video/:/transcode/session/ipq0ylx83ajhun6gkwiuaqsq/75278953-211e-4050-8834-dea2dde5b653/progress

From this we can sort of tell that the video segments, progress, etc. are passed back to Plex via http://127.0.0.1:32400/video/:/transcode/

As for system requirements, it depends on how many streams the user wants to do at once. A Core 2 Duo 2.0 GHz should be able to do a single 720p transcode. I previously had a home server running an
Intel G1840 that could handle two to three transcoded playbacks.

We should make it as easy for the end user as possible. I think a solution that is contained within streama would be best and would make it the most flexible.

Will this transcoder support remuxing video (just changing the container)? There's no need to actually transcode a file that is already H.264; sometimes an MKV container just needs to be changed over to MP4.

As far as I understand it, this takes far less time and CPU power, since it only changes the file type and doesn't reduce the video quality.

Yeah, we will definitely do that. If it's just the container that needs to be changed, there's no sense in completely decoding and re-encoding the data.
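
For reference, a container-only change is just a stream copy in ffmpeg; a minimal sketch with hypothetical file names (this assumes the audio codec is also allowed in MP4, otherwise only the audio track would need re-encoding):

# remux MKV to MP4 without re-encoding; +faststart moves the index to the front for streaming
ffmpeg -i input.mkv -c copy -movflags +faststart output.mp4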

Also, a heavily requested feature lacking from Plex and other platforms is proper ordered-chapters support. This allows TV series that ship with the same intro to save space by storing the intro/ending sequence only once and playing it at a set time in the other MKVs.

It also allows easily auto-skipping intros, or playing only the intros.

Anything new in transcoding support?

Nothing really "new", but I really want to focus on figuring it out.

I think what I'm going to try is to spawn an FFMPEG process to write a temp file in the correct format when the file is requested and can't be directly played by the device. That way we can support multiple devices and qualities and don't have to store the same video multiple times. You'll just need a few gigs of space to hold the temp file.
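
A rough sketch of the kind of command that spawned process would run (the temp path and settings are hypothetical, nothing in streama creates these yet):

# hypothetical one-off transcode of an upload into a browser-friendly temp MP4
ffmpeg -i /data/uploads/original.mkv \
  -c:v libx264 -preset veryfast -crf 23 \
  -c:a aac -b:a 160k \
  -movflags +faststart /tmp/streama-transcode/12345.mp4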

Okay, so what I'm looking at at the moment is integrating hls.js and using ffmpeg to generate an HLS stream (see the sketch after the list below). This will be compatible with:

  • Chrome for Android 34+
  • Chrome for Desktop 34+
  • Firefox for Android 41+
  • Firefox for Desktop 42+
  • IE11+ for Windows 8.1+
  • Edge for Windows 10+
  • Opera for Desktop
  • Vivaldi for Desktop
  • Safari for Mac 8+ (beta)
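
Roughly, the kind of ffmpeg invocation that would feed hls.js looks like this (file names and segment length are hypothetical, just to illustrate the idea):

# hypothetical HLS packaging: segmented .ts files plus an index.m3u8 playlist for hls.js
ffmpeg -i input.mkv \
  -c:v libx264 -preset veryfast -crf 23 \
  -c:a aac -b:a 160k \
  -f hls -hls_time 4 -hls_list_size 0 \
  -hls_segment_filename 'seg_%05d.ts' \
  index.m3u8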

I also want to do detection for direct-play videos - so we don't have to create an HLS stream if we don't need to.
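
Direct-play detection could be as simple as probing the container and codecs and comparing them against what the client can play; a sketch, assuming ffprobe ships alongside ffmpeg:

# prints container and per-stream codec names as JSON;
# e.g. h264 video + aac audio in an mp4 container can usually be direct-played
ffprobe -v quiet -print_format json -show_format -show_streams input.mkv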

When I have a working alpha I'll push it to a branch.

Any update on this? Happy to help test it out if there is.

At the risk of being annoying: is there any news on this? I'd really LOVE to see this feature in streama!

I know this is a very highly sought-after feature. I had some tests I was working on back in March - nothing close to working. Unfortunately, due to some bad hardware, I don't have those anymore, not that they were incredibly useful.
Between work and personal issues I haven't had the time or motivation to do much programming, let alone dive into this huge feature.
I'm really sorry, and I do hope to get back to working on streama much more. I have been doing some work on the documentation, something that streama is very lacking in; I hope to publish that this weekend.

If anyone wants to help with this feature specifically get in touch with me.

VMAF is used to validate visual media quality; Photon, from my understanding, is for validating IMF, which is a containerized media format - probably not what most use cases of this software need.
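
(For what it's worth, if ffmpeg is built with libvmaf, a VMAF score for a transcode can be computed against the original; a sketch with hypothetical file names:)

# first input is the transcoded/distorted file, second is the reference;
# both must have matching resolution and frame rate (scale first if not)
ffmpeg -i transcoded.mp4 -i original.mkv -lavfi libvmaf -f null -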

Hi all, I have some ideas regarding transcoding. I think that we have to be realistic and admit that implementing transcoding as part of streama is a bad idea because:

  1. This is an open-source project with a few core devs and implementing this feature is a huge challenge. It can't be done (in reasonable time and at reasonable quality) in these circumstances. There are more important issues that actually have a chance of being implemented.
  2. Most of the people who host streama are probably using relatively cheap ($5-10) VPS instances, and transcoding on low-end hardware will be extremely slow, especially if you're going to use better x264 settings.
  3. Encoding has a lot of different parameters and things could get complicated quite fast. There is no "one preset rules them all" because not all content will look good enough using one preset. My point is that implementing transcoding would probably make content look worse than it could be if it were done "manually" through a trial-and-error approach.
  4. Encoding requires even more dependencies on the server.

So instead of having a transcoding feature, I suggest having the ability to upload multiple videos of the same movie/episode but with different sizes and qualities. For instance, you could upload 3 files for one movie so you have high, medium and low quality versions of the same movie. It's much easier to implement, it doesn't choke low-end VPS instances, and you have full control over the encoder and encoding settings.
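
To illustrate, the three uploads could be prepared manually with something like this (file names and settings are just an example):

# hypothetical manual encodes of one movie at three quality tiers
ffmpeg -i movie.mkv -c:v libx264 -crf 20 -vf scale=-2:1080 -c:a aac -b:a 192k movie-high.mp4
ffmpeg -i movie.mkv -c:v libx264 -crf 23 -vf scale=-2:720 -c:a aac -b:a 160k movie-medium.mp4
ffmpeg -i movie.mkv -c:v libx264 -crf 26 -vf scale=-2:480 -c:a aac -b:a 128k movie-low.mp4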

What do you think?

I'm not a Streama contributor by any means, but I'm a user and have followed this issue for a while. I do not wish to add noise to the issue but I thought @IvanBernatovic's comment was worth responding to since I believe my use case can't be that uncommon.

This is an open-source project with a few core devs and implementing this feature is a huge challenge. It can't be done (in reasonable time and at reasonable quality) in these circumstances. There are more important issues that actually have a chance of being implemented.

Valid point. But not an issue that I believe should remove the item from the roadmap altogether, since someone might come by and want to scratch an itch. The bulk of the work is probably already handled by third-party libraries such as ffmpeg, so it would mostly be about integrating it into Streama in a user-friendly fashion and not breaking the current file handling/serving code (I assume, since I'm not familiar with the code base).

Most of the people who host streama are probably using relatively cheap ($5-10) VPS instances, and transcoding on low-end hardware will be extremely slow, especially if you're going to use better x264 settings.

I was actually surprised to hear that someone would use a VPS for hosting. I'd assume that the extreme amounts of storage and throughput would make cloud hosting very cost inefficient. I run it on an old server with tons of storage in a closet and my assumption would be that most people would have them stored on a NAS or something.

Encoding has a lot of different parameters and things could get complicated quite fast. There is no "one preset rules them all" because not all content will look good enough using one preset. My point is that implementing transcoding would probably make content look worse than it could be if it were done "manually" through a trial-and-error approach.

Of course the suggested approach would be optimal quality-wise, but I don't think anyone wishing for this feature wants it to provide a top-notch streaming service; it's a convenience feature, otherwise they'd already convert the files into the correct format and codec. I have a big old DVD library that I digitized, all in MKV, that I'll never have the time or interest to manually transcode and re-upload. It's a convenience feature: I don't care about possibly squeezing out a higher-quality stream, I just want A stream.

Encoding requires even more dependencies on the server.

Pretty sure all you really need to add is ffmpeg if you want to go for a straightforward implementation.

I know this is ultimately up to the devs but I thought this alternative use case where hardware performance isn't necessarily an issue should be presented as well. I believe there are good reasons to support built-in transcoding.

Anything new here?