React Native buffer for live streaming?

I’m using LiveStreamView from “@api.video/react-native-livestream” to live stream from a device and upload to an RTMP server; however, it seems that if the device’s connection is poor, video will drop from the upload. Is it possible to buffer the upload before it is sent to the server, so that the video arrives smooth and even when the connection is poor?
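For reference, here is roughly how the view is wired up — a minimal sketch assuming the v1 `LiveStreamView` props and methods (names may differ between versions, so check the library README):

```tsx
// Minimal sketch of the setup described above. Prop and method names are
// assumptions based on the library's documented v1 interface; the stream
// key and RTMP URL are placeholders.
import React, { useRef } from 'react';
import { Button, View } from 'react-native';
import { LiveStreamView } from '@api.video/react-native-livestream';

export default function Streamer() {
  const ref = useRef<any>(null); // exposes startStreaming() / stopStreaming()

  return (
    <View style={{ flex: 1 }}>
      <LiveStreamView
        style={{ flex: 1, backgroundColor: 'black' }}
        ref={ref}
        video={{ fps: 30, resolution: '720p', camera: 'back', orientation: 'portrait' }}
        liveStreamKey="your-stream-key"           // placeholder
        rtmpServerUrl="rtmp://your.server/live"   // placeholder
        onConnectionFailed={(reason: string) => console.warn('connect failed:', reason)}
        onDisconnect={() => console.warn('disconnected')}
      />
      <Button title="Go live" onPress={() => ref.current?.startStreaming()} />
    </View>
  );
}
```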

Hi jeric,
If you start to buffer frames because of a poor connection, you will only accumulate more and more buffered frames. I guess what you mean is increasing the delay/latency. If the connection stays poor, that just postpones the problem: at some point you have to send the buffered frames, and if the connection is still poor, it won’t keep up. Moreover, RTMP is not naturally suited to that.

Losing frames on a poor connection is completely normal behavior for live streaming. If you are not ready for it, could you consider uploading a video instead of live streaming?

The only real option for a poor connection is to reduce the video and/or audio bitrates.
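As a rough illustration (field names, in particular `bitrate`, depend on the library version you have installed, so treat this as a sketch and check the README):

```ts
// Hedged sketch: lower-bandwidth settings for the video/audio props of
// LiveStreamView. Field names are assumptions based on the library's
// documented config shape; verify against your installed version.
const lowBandwidthVideo = {
  fps: 30,
  resolution: '480p',  // smaller frames need fewer bits per second
  bitrate: 800_000,    // ~0.8 Mbps video (assumed field name)
  camera: 'back',
  orientation: 'portrait',
};

const lowBandwidthAudio = {
  bitrate: 64_000,     // 64 kbps audio
};

// <LiveStreamView video={lowBandwidthVideo} audio={lowBandwidthAudio} ... />
```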

Best regards,
Thibault

thibault1,

Thank you for the reply! I apologize for any misuse of nomenclature; I’m a bit of a novice in this particular area. What you’re saying makes sense: increasing the delay leads to more buffering and effectively makes the video no longer “live”.

However, the client has accepted this risk: they are willing to let the cost of higher-res uploads diminish the “live” experience.

The end goal is to live stream at a high res (even at the cost of a delay in the stream).

I think that recording and uploading later could be a good option, or maybe even streaming as-is and then replacing the stream data with the recorded data after the broadcast ends. I’m not sure how intensive simultaneous streaming and recording would be for a mobile device…

I brought up variable bitrate; however, it was reiterated that quality > live was the priority.

That being said, do you have any recommendations for delaying or buffering the upload? Is it possible to add a delay time with the current implementation of the library?

Outside of this tooling, I considered maybe using Expo Camera to capture the data, FFmpeg to fragment it, and then sending it to the server a chunk at a time… not quite sure :sweat_smile:
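Something along these lines, maybe — a rough sketch using ffmpeg-kit-react-native (paths and the 5-second segment length are illustrative; also worth noting that a plain RTMP ingest expects a continuous stream, so chunked delivery pairs more naturally with an HTTP upload endpoint):

```ts
// Rough sketch: split a finished recording into small chunks with
// ffmpeg-kit-react-native, so each chunk can be uploaded (and retried)
// independently. Paths and segment length are illustrative.
import { FFmpegKit, ReturnCode } from 'ffmpeg-kit-react-native';

async function segmentRecording(inputPath: string, outDir: string): Promise<void> {
  // -c copy avoids re-encoding (cheap on a phone); -f segment splits the
  // file into ~5 s pieces. With stream copy, cuts land on keyframes, so
  // chunk durations are approximate.
  const session = await FFmpegKit.execute(
    `-i ${inputPath} -c copy -f segment -segment_time 5 ` +
    `-reset_timestamps 1 ${outDir}/chunk_%03d.mp4`
  );
  const rc = await session.getReturnCode();
  if (!ReturnCode.isSuccess(rc)) {
    throw new Error('FFmpeg segmentation failed');
  }
}
```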

Hi,

There is no way to set latency in the current version of the RN library.

Do you really need a live experience? If not, you could consider using api.video progressive upload. We have been working for a while on mobile libraries that record the camera and microphone streams and progressively upload them at the same time. We call them upstream. The Android version has been released on GitHub but needs a little upgrade.
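At the HTTP level, progressive upload means sending each part as soon as it is recorded, so the upload overlaps the recording. A minimal sketch of the client side, assuming api.video’s documented progressive-upload convention (`Content-Range: part n/*` for intermediate parts, `part n/n` for the last one); double-check the header format and the auth flow against the current docs:

```ts
// A minimal sketch of one progressive-upload part. The Content-Range
// convention ("part n/*" until the final "part n/n") and the Bearer auth
// are assumptions to verify against api.video's current documentation
// (a delegated upload token is preferable to a raw token in a real app).
async function uploadPart(
  videoId: string,
  accessToken: string,
  partPath: string,     // local file path of one recorded chunk
  partNumber: number,   // 1-based index
  totalParts?: number,  // pass only when sending the final part
): Promise<void> {
  const form = new FormData();
  // React Native's FormData takes a { uri, name, type } descriptor.
  form.append('file', {
    uri: partPath,
    name: `part${partNumber}.mp4`,
    type: 'video/mp4',
  } as any);

  const range = totalParts
    ? `part ${partNumber}/${totalParts}` // final part closes the upload
    : `part ${partNumber}/*`;            // more parts are still coming

  const res = await fetch(`https://ws.api.video/videos/${videoId}/source`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Range': range,
    },
    body: form,
  });
  if (!res.ok) {
    throw new Error(`Part ${partNumber} failed with HTTP ${res.status}`);
  }
}
```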

Yes! This is exactly what I need. Is there any timetable for the iOS version, or a branch that can be looked at and contributed to?

One of the differences between progressive upload and live streaming is that you won’t be able to play the video until all parts of the file have been sent. But if your focus is on quality and you don’t need live interaction with users, it is the best solution.

The iOS upstream is in a private project on GitHub. If you provide your GitHub account or email, we will invite you to the project.

Here is my GitHub; appreciate the invite! This seems like a perfect use case for what we’re trying to do. It’s as you said: if we can have it broadcast “live”, that’s great, but mainly they are focused on quality.

Ok. You have been added as a reader of the project.
You can already launch the example and upload the video to api.video.

The Swift upstream is still missing a few features that we plan to tackle:

  • upload in the background
  • create an upload session so parts that have not been sent can be reuploaded (a rough sketch of that retry idea follows below)
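```ts
// A rough sketch of the retry idea: send parts strictly in order (progressive
// upload generally expects sequential parts) and retry each failed part with
// exponential backoff. "sendPart" is a hypothetical callback that uploads one
// recorded chunk, e.g. wrapping the progressive-upload call sketched above.
async function uploadAllParts(
  partPaths: string[],
  sendPart: (path: string, index: number) => Promise<void>,
  maxAttempts = 5,
): Promise<void> {
  for (const [index, path] of partPaths.entries()) {
    for (let attempt = 1; ; attempt++) {
      try {
        await sendPart(path, index);
        break; // part sent, move on to the next one
      } catch {
        if (attempt >= maxAttempts) {
          throw new Error(`Part ${index} failed after ${maxAttempts} attempts`);
        }
        // Back off before retrying so a flaky link gets time to recover.
        await new Promise((r) => setTimeout(r, 1000 * 2 ** attempt));
      }
    }
  }
}
```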