[Update] Video size guide when uploading

Hello! :blush:

A new post to tell you that we’ve improved our documentation regarding video sizes.

:point_right: For small files (up to 128 MB), you can follow this tutorial


:point_right: For bigger files, you can use the split utility to split the video into smaller chunks:

split --bytes=100M source.mp4 file_chunk_
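To illustrate, here is a hedged end-to-end sketch of the split-and-verify workflow. It uses a throwaway dummy file so it can be run anywhere; replace demo.mp4 with your real video, and note it assumes GNU coreutils (split, cat, cmp) are available:

```shell
# Create a ~10 MB dummy file standing in for a real video (illustrative only).
dd if=/dev/zero of=demo.mp4 bs=1M count=10 2>/dev/null

# Split into 4 MB chunks named demo_chunk_aa, demo_chunk_ab, ...
split --bytes=4M demo.mp4 demo_chunk_

# The chunks concatenate back to the original byte-for-byte:
cat demo_chunk_* > rejoined.mp4
cmp demo.mp4 rejoined.mp4 && echo "chunks verified"
```

Because split cuts on plain byte boundaries, reassembling the chunks with cat always reproduces the original file exactly.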

All details are in our documentation here.

Don’t hesitate to give us your feedback!


How are you supposed to split a file if it comes from a file input on a website?

Hi!

Thanks for the question. The issue that arises with copying large files is that if there is any transport error, the upload will fail. The reason we recommend smaller byte range uploads is to minimise the time spent with failed uploads. You could try to upload the large video using the source attribute, and it may work (assuming no transport issues).

However, if the full upload fails, you’ll need to copy the video, break it into chunks, and then upload the segments, as outlined in the documentation and the tutorial (docs.api.video)

Doug

Right, so is it possible to split a file using JavaScript within a website? It sounds like splitting the file anywhere other than the user’s local machine is not possible, because you’d just encounter the same issue of sending a large file somewhere else.

Great questions!

If you have file access on the server, file.slice might do what you are looking for.
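For reference, here is a minimal sketch of what slicing a File (or any Blob) into fixed-size chunks looks like; sliceIntoChunks and CHUNK_SIZE are illustrative names, not part of any api.video API:

```javascript
// 60 MB per chunk (illustrative; pick a size under the upload limit).
const CHUNK_SIZE = 60 * 1024 * 1024;

// Split a File/Blob into an array of smaller Blobs using slice().
function sliceIntoChunks(file, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let start = 0; start < file.size; start += chunkSize) {
    // slice()'s end index is exclusive and clamped to file.size,
    // so the last chunk is simply whatever bytes remain.
    chunks.push(file.slice(start, start + chunkSize));
  }
  return chunks;
}
```

Each returned Blob can then be appended to a FormData and uploaded on its own.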

As for downloading/slicing/uploading: HTTP has issues UPLOADING large files, but not DOWNLOADING them. So the process of copying to your machine (or server) should be fine. Then you can split & upload the segments.

Doug

I am referring to uploading videos on a website within a browser.

  async ({ videoId }) => {
    const chunkSize = 125000000; // 125 MB
    for (let from = 0; from < videoFile.size; from += chunkSize + 1) {
      const to = Math.min(videoFile.size - 1, from + chunkSize);
      const chunk = videoFile.slice(from, to);
      const formData = new FormData();
      formData.append('data', chunk);
      await fetch(`https://sandbox.api.video/videos/${videoId}/source`, {
        method: 'POST',
        headers: {
          Authorization: `Bearer ${videoAPIAccessToken.current}`,
          'Content-Range': `bytes ${from}-${to}/${videoFile.size}`,
        },
        body: formData,
      });
    }
  })

This is what I have right now, it uploads files in chunks, there is no server error, and a video container is created. However, if the video is sent in multiple chunks (i.e. file size is greater than chunkSize), the uploaded video on the dashboard will just show the 5 second “This video is being processed…” placeholder. If the uploaded video file size is less than chunkSize, the video is uploaded properly and you can watch it online. Can you spot the issue in my code?

Could you just share the code snippet used to upload video files on “https://go.api.video/”? That works well and uploads videos a lot faster than my code for some reason.


Here is the code we use on go.api.video. It should help you get started:

<h1>Chunked</h1>
<form id="upload-form" action="https://ws.api.video/upload?token=YOUR_UPLOAD_TOKEN" method="post">
    <input type="file" name="file">
    <input type="submit">
</form>
<script>
    var CHUNK_SIZE = 60 * 1024 * 1024; // 60MB size of one chunk
    function upload(file, uploadUrl, onSuccess, onError) {
        var chunkCount = Math.ceil(file.size / CHUNK_SIZE);
        function uploadChunk(file, chunkNum, videoId) {
            var from = CHUNK_SIZE * chunkNum;
            var blob = file.slice(from, from + CHUNK_SIZE);
            var formData = new FormData();
            if (videoId) {
                formData.append("videoId", videoId);
            }
            formData.append("file", blob, file.name);
            var xhr = new XMLHttpRequest();
            xhr.onload = function () {
                if (this.status >= 300) {
                    return onError(chunkNum, chunkCount, this);
                }
                videoId = JSON.parse(xhr.responseText).videoId;
                if (onSuccess) {
                    onSuccess(chunkNum, chunkCount, videoId, this);
                }
                if (chunkNum < chunkCount - 1) {
                    uploadChunk(file, chunkNum + 1, videoId);
                }
            };
            xhr.onerror = function () {
                if (onError) {
                    onError(chunkNum, chunkCount);
                }
            };
            xhr.open("POST", uploadUrl, true);
            xhr.setRequestHeader("Content-Range", "bytes " + from + "-" + (from + blob.size - 1) + "/" + file.size);
            xhr.send(formData);
        }
        uploadChunk(file, 0);
    }
    document.getElementById("upload-form").onsubmit = function (e) {
        e.preventDefault();
        upload(
            this.elements[0].files[0],
            this.action,
            function (chunkNum, chunkCount, videoId, xhr) {
                console.log('Success!', chunkNum, chunkCount, videoId);
            },
            function (chunkNum, chunkCount, xhr) {
                console.log('Error.', chunkNum, chunkCount, xhr.status);
            }
        );
    };
</script>
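One subtlety in the snippet above that often trips people up: Blob.slice() takes an exclusive end index, while the Content-Range header uses inclusive byte positions, so the header’s end byte is from + blob.size - 1. A hedged sketch of that range arithmetic (contentRanges is an illustrative helper, not part of the snippet above):

```javascript
// Compute the Content-Range header value for each chunk of a file.
function contentRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let from = 0; from < fileSize; from += chunkSize) {
    // slice(from, from + chunkSize) yields bytes from..to inclusive,
    // because slice()'s end index is exclusive and clamped to fileSize.
    const to = Math.min(from + chunkSize, fileSize) - 1;
    ranges.push(`bytes ${from}-${to}/${fileSize}`);
  }
  return ranges;
}
```

For example, a 10-byte file split into 4-byte chunks produces the headers bytes 0-3/10, bytes 4-7/10, and bytes 8-9/10, with no gap and no overlap between ranges.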

We have SUCCESSFULLY implemented api.video into our commercial product and have uploaded files as large as 12 GB using chunking as described. It DOES work, and works well.


That’s good, if I understand correctly. Thanks!

If you can, throw it into my chat room in the broadcast room.