Uploading Extremely Large Files, The Awesome Easy Way [Tus Server]

Do you want your users to upload large files? .. I mean really large files, like multi-GB videos .. Do your users live in places with a bad or unstable connection? .. If your answer is “yes”, then this article is exactly what you are looking for.

Background

At Chaino Social Network, we do care about our users. Their feedback is our daily fuel, and a smooth, enhanced UX is what we seek in everything we do for them. They asked for a video uploads feature, we made it for them .. they asked for a smaller processing time, we made that too .. they asked to upload larger video files, so again we took our precautions and increased the size limit to 1 GB .. but then we felt like we had hit a wall, when we were swarmed by users’ feedback complaining that video uploads are easily interrupted by bad networking, and the problem gets worse as the files get bigger, where they become even more vulnerable to interruptions and failures .. that is when we started our hunt for a solution. But let’s first look at the current way of uploading.

Problem with normal way of Uploading

Our normal way of uploading basically uses the change event to start validating the file, then uploads it as a multipart request, where nginx does all the work for us and then delivers the file path to our PHP backend, where we can start processing the video, like the following:
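Something along these lines (a simplified sketch; the input id, endpoint and size limit here are illustrative placeholders, not our production code):

// <input type="file" id="video-input" accept="video/*">
document.getElementById('video-input').addEventListener('change', function (e) {
    var file = e.target.files[0];
    if (!file || file.size > 1000 * 1000 * 1000) { // basic validation, e.g. a 1 GB limit
        return alert('Please pick a video up to 1 GB');
    }

    // one multipart request carrying the whole file; nginx buffers it,
    // then hands it over to the PHP backend for processing
    var formData = new FormData();
    formData.append('video', file);

    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/videos/upload'); // hypothetical endpoint
    xhr.send(formData);
});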

As you can see, both the client & server sides expect the file to be sent in one shot no matter how big it is. But in the real world, networks get interrupted all the time, which forces the user to upload the file over & over again from the beginning, which is super frustrating with large files like videos!

In our hunt for a solution, at first we found some honorable mentions like Resumable.js, but they didn’t really introduce a complete solution, because most of them focus only on the client side, where the server side is actually the real challenge! But then we found the only real complete solution out there, Tus.io, which was beyond our dreams!

Solution .. The Awesome One

Now we need a solution that can do 2 things:

  • Protect our users from network interruptions, where the solution should be able to automatically retry sending the file until the network is hopefully stable again.
  • Give our users a resumable upload in case of total network failure, so after the network comes back, users can continue uploading the file from where they left off.

I know these requirements seem like a dream, but this is why Tus.io is so awesome on so many levels. Simply put, this is how it works:
How tus.io can increase both speed and reliability of file uploads
(Illustration by Alexander Zaytsev)
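In a nutshell, the client first asks the server to create an upload and gets back a dedicated upload URL, then sends the file in chunks while the server keeps track of how many bytes it has received; after any interruption, the client simply asks for the current offset and continues from there. A simplified sketch of the exchange on the wire (tus 1.0; headers trimmed to the essentials, URLs made up):

# 1. create the upload, the server answers with its URL
POST /files HTTP/1.1
Tus-Resumable: 1.0.0
Upload-Length: 1073741824

HTTP/1.1 201 Created
Location: https://example.com/files/24e533e0

# 2. send the data in chunks, each PATCH declares where it starts
PATCH /files/24e533e0 HTTP/1.1
Tus-Resumable: 1.0.0
Upload-Offset: 0
Content-Type: application/offset+octet-stream
(binary chunk)

HTTP/1.1 204 No Content
Upload-Offset: 1000000

# 3. after an interruption, ask how far the server got and resume from there
HEAD /files/24e533e0 HTTP/1.1
Tus-Resumable: 1.0.0

HTTP/1.1 200 OK
Upload-Offset: 423000000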

Tus is now adopted & trusted by many products, like Vimeo.

Now, if we are going to cook this awesome meal, let’s first list our ingredients:

  • TusPHP for server side
    • We used the PHP library, but you can actually use other server-side implementations like Node.js, Go or others
  • tus-js-client for client side
    • TusPHP already has a PHP client, but I think we would all agree to use the JavaScript implementation for the client over the PHP one.

Getting Started
Let’s install TusPHP (for server-side) using composer:

composer require ankitpokhrel/tus-php

and install tus-js-client (for client-side) using npm:

npm install tus-js-client

& here is the basic usage for both of them:
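Roughly, the server side is a tiny endpoint that hands every tus request over to TusPHP, and the client side wraps the file in a tus.Upload pointed at that endpoint. A minimal sketch based on both libraries’ documented examples (the endpoint path and upload directory are placeholders):

<?php
// server.php: every request under /files is routed here by nginx
require __DIR__ . '/vendor/autoload.php';

$server = new \TusPhp\Tus\Server('file'); // file cache, for simplicity
$server->setApiPath('/files');            // the tus endpoint
$server->setUploadDir(__DIR__ . '/uploads');

$response = $server->serve();
$response->send();
exit(0);

And the client side:

// var tus = require('tus-js-client'); or load it from a <script> tag
document.getElementById('video-input').addEventListener('change', function (e) {
    var file = e.target.files[0];

    var upload = new tus.Upload(file, {
        endpoint: 'https://example.com/files',      // the TusPHP endpoint above
        retryDelays: [0, 3000, 5000, 10000, 20000], // auto-retry on flaky networks
        metadata: { filename: file.name, filetype: file.type },
        onError: function (error) { console.log('Upload failed: ' + error); },
        onProgress: function (bytesUploaded, bytesTotal) {
            console.log(((bytesUploaded / bytesTotal) * 100).toFixed(2) + '%');
        },
        onSuccess: function () { console.log('Uploaded to ' + upload.url); }
    });

    upload.start();
});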

I tried this combination locally & everything worked like a charm. But then we deployed the solution to our Beta servers, and this is when the panic began 🙈.

Production Shocks

I know we deployed on Beta servers only, but let’s face it, most of us expect Beta to be like 5 minutes away from deploying to production .. which wasn’t our case 🙂. So, these are the problems we faced after using a real production-like environment:

Permission Denied for File Cache
Remember we used the File cache earlier for simplicity? Well, the library expects you to pass a configuration for the cache file’s path, or it will just create the cache file inside the library’s folder under vendor, which gives us a Permission Denied error for trying to write to the vendor folder without the right permissions. So let’s just pass the right configuration with a path accessible by all of the server’s users, like the /tmp folder (no need to keep a long-term cache; after all, the cache won’t exceed 24 hours per file by design). Here is how you can do so:

TusPhp\Config::set([
    /**
     * File cache configs.
     *
     * Adding the cache in the '/tmp/' because it is the only place writable by
     * all users on the production server.
     */
    'file' => [
        'dir' => '/tmp/',
        'name' => 'tus_php.cache',
    ],
]);

HTTPS at the load-balancer
Locally I’m using a self-signed certificate, so all the traffic reaching the backend is fully HTTPS. On production, however, SSL terminates at the load balancer, which forwards the traffic to our servers as plain HTTP. This tricked Tus into believing that the video URL is HTTP-only, which breaks the uploading, so I had to fix the response headers to add the HTTPS back again:

// in the file controller before sending the response
// get/set headers the way that suits your framework
$location = $response->headers->get('location');
if (!empty($location)) { // `location` is sent to the client only the 1st time
    $location = preg_replace("/^http:/i", "https:", $location);
    $response->headers->set('location', $location);
}

PATCH is not supported
Yet, it still wasn’t working. It turns out that our production environment setup doesn’t allow PATCH requests, and this is where tus-js-client came to the rescue with its option overridePatchMethod: true, which relies on the usual POST requests instead.
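It is just one extra option on the client (a sketch; the rest of the options stay as before):

var upload = new tus.Upload(file, {
    endpoint: 'https://example.com/files',
    overridePatchMethod: true, // our setup rejects PATCH, so send the chunks as POST instead
    // ... the other options stay the same
});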

Re-Uploading starts from 0% !!
Now everything works fine. On my local machine uploads were lightning fast, so I couldn’t actually test the resumability part of our solution. So let’s try it on Beta: cancel the upload at 40% and try to re-upload it .. Oh Ooh, it started from 0%. What the heck just happened?!
After digging a lot into my server part (which was my suspect), it turns out that tus-js-client has an option called chunkSize with a default value of Infinity, which means the whole file is uploaded at once 🙈 !! So I fixed it by specifying a chunk size of 1 MB: chunkSize: 1000 * 1000

Wrapping up the whole solution
After putting it all together, here is our final version:
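The combined setup looks roughly like this (a sketch; the endpoint URL, paths and logging are placeholders):

<?php
// server side: cache in /tmp, serve the tus endpoint, re-add https to the Location header
require __DIR__ . '/vendor/autoload.php';

\TusPhp\Config::set([
    'file' => ['dir' => '/tmp/', 'name' => 'tus_php.cache'],
]);

$server = new \TusPhp\Tus\Server('file');
$server->setApiPath('/files');
$server->setUploadDir('/var/www/uploads');

$response = $server->serve();

// SSL terminates at the load balancer, so force https back into the upload URL
$location = $response->headers->get('location');
if (!empty($location)) {
    $response->headers->set('location', preg_replace('/^http:/i', 'https:', $location));
}

$response->send();
exit(0);

And on the client:

var upload = new tus.Upload(file, {
    endpoint: 'https://example.com/files',
    chunkSize: 1000 * 1000,                     // 1 MB chunks, so resuming actually works
    retryDelays: [0, 3000, 5000, 10000, 20000], // survive short network hiccups
    overridePatchMethod: true,                  // our servers don't accept PATCH
    metadata: { filename: file.name, filetype: file.type },
    onError: function (error) { console.log('Upload failed: ' + error); },
    onProgress: function (bytesUploaded, bytesTotal) {
        console.log(((bytesUploaded / bytesTotal) * 100).toFixed(2) + '%');
    },
    onSuccess: function () { console.log('Uploaded to ' + upload.url); }
});

upload.start();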

Conclusion
Before Tus, I always thought that uploading files had only one traditional way and no one could touch it, to the extent that I felt it was pointless to even search for a solution. But never stop at your own boundaries; break them & go beyond, and you will reach new destinations you never thought possible.
Now uploading large files has become dead simple, & I really want to thank the team behind Tus.io for what they did.

Android: Loading images Super-Fast like WhatsApp – Part 1

Zingoo is a new promising app that will rock your weekends, outings, and any happening whose moments you want to easily enjoy watching over & over again (we are now doing the Android version, then the iOS one). Because we want Zingoo to be born strong, it has to deliver the best possible UX to all awesome-moments lovers around the world, which means we have to do our best in loading images.

Because we (at Begether) do listen to our users, we heard a lot of comments on how WhatsApp loads images super-fast, so we dug deeper to see what we could do about it, & here is what we found:

What does WhatsApp do?
WhatsApp is doing the following (numbers are approximate):

  • Photo sizes range around 100 KB, which loads the images pretty fast on most common mobile-network speeds. (Part 2 explains how to achieve this.)
  • Photos are cached, so there is no need to load them every time you open the app (almost no need to mention this 🙂 ).
  • They first show a very small thumbnail (about 10KB or less) until the real image is loaded, & this is the real pro-tip for their better UX.

The last tip has a different variant: calculating the image dimensions & the approximate color of the image that will be shown, and applying them to its placeholder, like in the coming 3 minutes of this video:

but still, the thumbnail is way cooler, right? 😉

How is it done?
To achieve the caching, there are some good Android libraries out there doing a good job, but one of them is doing way better than the others, which is Picasso. Both disk & memory caching are built in under the hood, with a very developer-friendly API. I just love what Jake Wharton & his mates did for all of us, thanks guys.
Using Picasso is pretty easy, just like this example one-liner:

Picasso.with(context).load("http://i.imgur.com/DvpvklR.png").into(imageView);

You just need to first add Picasso to your Gradle file along with the urlConnection library (according to this issue), like this:

compile 'com.squareup.picasso:picasso:2.4.0'
compile 'com.squareup.okhttp:okhttp-urlconnection:2.0.0'

After solving the caching issue, we need to apply the great thumbnail tip: we need to use Picasso 2 times, one for loading the thumbnail and the other for loading the real image, like the comment I made on this issue. Also, to avoid the thumbnail’s pixelation effect (due to its small size), it would be better to apply a blurring effect to it,
WhatsApp's Thumbnail Loading Effect

and here is how it is done:

// a Picasso Transformation that blurs the tiny thumbnail to hide its pixelation
Transformation blurTransformation = new Transformation() {
    @Override
    public Bitmap transform(Bitmap source) {
        Bitmap blurred = Blur.fastblur(LiveImageView.this.context, source, 10);
        source.recycle(); // Picasso requires recycling the source when returning a new bitmap
        return blurred;
    }

    @Override
    public String key() {
        return "blur()"; // cache key for the transformed thumbnail
    }
};

// 1st pass: load the small thumbnail, blurred, into the ImageView
Picasso.with(context)
    .load(thumbUrl) // thumbnail url goes here
    .placeholder(R.drawable.placeholder)
    .resize(imageViewWidth, imageViewHeight)
    .transform(blurTransformation)
    .into(imageView, new Callback() {
        @Override
        public void onSuccess() {
            // 2nd pass: load the full image, using the blurred thumbnail as its placeholder
            Picasso.with(context)
                    .load(url) // image url goes here
                    .resize(imageViewWidth, imageViewHeight)
                    .placeholder(imageView.getDrawable())
                    .into(imageView);
        }

        @Override
        public void onError() {
        }
    });

We used the Callback() functionality to start loading the full image only after the thumbnail is completely loaded, using the blurred thumbnail’s drawable as the new placeholder for the real image, & this is how the magic is done right here :).
Also, the blurring used here is Blur.fastblur(); thanks to Michael Evans & his EtsyBlurExample, you can find this class here.

The only remaining part is how to compress the large images (which could be 2 to 4 MB) to be only about 100 KB, which is discussed in Part 2.