I don't use Dropbox often. In fact, I would prefer not to use it at all, but work requires me to from time to time. My only use case is uploading files to a shared folder, so running the desktop client makes no sense for me, and logging in to the Dropbox website just to upload a file is painful. So I wrote a very simple Dropbox uploader in Python that uploads files to a specific folder.

Here's how you would use it to upload local_file_to_upload.extension to a Dropbox folder named db_folder:

$ python simple_db_uploader.py /db_folder local_file_to_upload.extension

Here is the script, released into the public domain. It uses Dropbox's chunked upload session flow: start a session with the first chunk, append the remaining chunks, then finish the session to commit the file:

import requests
import json
import sys
import os


def exit_on_error(response):
    print("Status Code: {0}".format(response.status_code))
    print("Response: {0}".format(json.dumps(response.json(), indent=4)))
    sys.exit(1)


def main():
    access_token = "<access_token>"
    upload_file = open(sys.argv[2], 'rb')
    file_size = os.path.getsize(sys.argv[2])
    chunk_size = 2 * 10**6  # upload in 2 MB chunks
    # Start an upload session with the first chunk.
    contents = upload_file.read(chunk_size)
    response = requests.post(
        "https://content.dropboxapi.com/2/files/upload_session/start",
        headers={
            "Authorization": "Bearer {0}".format(access_token),
            "Content-Type": "application/octet-stream"
        },
        data=contents
    )
    if not response.ok:
        exit_on_error(response)
    session_id = response.json()["session_id"]
    # Append the remaining chunks to the session.
    offset = chunk_size
    while offset < file_size:
        response = requests.post(
            "https://content.dropboxapi.com/2/files/upload_session/append",
            headers={
                "Authorization": "Bearer {0}".format(access_token),
                "Dropbox-API-Arg": json.dumps({
                    "session_id": session_id,
                    "offset": offset
                }),
                "Content-Type": "application/octet-stream"
            },
            data=upload_file.read(chunk_size)
        )
        if not response.ok:
            exit_on_error(response)
        offset = offset + chunk_size
    upload_file.close()
    # Finish the session and commit the file under its base name
    # inside the target Dropbox folder.
    path_args = {
        "cursor": {
            "session_id": session_id,
            "offset": file_size
        },
        "commit": {
            "path": os.path.join(sys.argv[1], os.path.basename(upload_file.name)),
            "mode": "overwrite",
            "autorename": False
        }
    }
    response = requests.post(
        "https://content.dropboxapi.com/2/files/upload_session/finish",
        headers={
            "Authorization": "Bearer {0}".format(access_token),
            "Dropbox-API-Arg": json.dumps(path_args),
            "Content-Type": "application/octet-stream"
        }
    )
    if not response.ok:
        exit_on_error(response)


if __name__ == '__main__':
    main()

The only thing missing from the script is the access token (the access_token variable at the top of main()). You need to create a Dropbox app, assign it the relevant permissions, and generate an access token to paste into the code, and you are ready to go.
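If you would rather not hard-code the token, a small tweak (just a sketch; it assumes you export a hypothetical DROPBOX_ACCESS_TOKEN variable in your shell first) is to read it from the environment instead:

import os

# Hypothetical environment variable; export it before running the script:
#   $ export DROPBOX_ACCESS_TOKEN="<access_token>"
access_token = os.environ["DROPBOX_ACCESS_TOKEN"]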

This script can be made a little zingier with tqdm, which displays a progress bar while the chunks are uploaded.

import requests
import json
import sys
import os
from tqdm import tqdm


def exit_on_error(response):
    print("Status Code: {0}".format(response.status_code))
    print("Response: {0}".format(json.dumps(response.json(), indent=4)))
    sys.exit(1)


def main():
    access_token = "<access_token>"
    upload_file = open(sys.argv[2], 'rb')
    file_size = os.path.getsize(sys.argv[2])
    chunk_size = 2 * 10**6  # upload in 2 MB chunks
    # Start an upload session with the first chunk.
    contents = upload_file.read(chunk_size)
    response = requests.post(
        "https://content.dropboxapi.com/2/files/upload_session/start",
        headers={
            "Authorization": "Bearer {0}".format(access_token),
            "Content-Type": "application/octet-stream"
        },
        data=contents
    )
    if not response.ok:
        exit_on_error(response)
    session_id = response.json()["session_id"]
    # Chunks still to append after the start request; integer division
    # keeps this working on Python 3.
    num_chunks = max((file_size - 1) // chunk_size, 0)
    offset = chunk_size
    for chunk in tqdm(range(num_chunks)):
        response = requests.post(
            "https://content.dropboxapi.com/2/files/upload_session/append",
            headers={
                "Authorization": "Bearer {0}".format(access_token),
                "Dropbox-API-Arg": json.dumps({
                    "session_id": session_id,
                    "offset": offset
                }),
                "Content-Type": "application/octet-stream"
            },
            data=upload_file.read(chunk_size)
        )
        if not response.ok:
            exit_on_error(response)
        offset = offset + chunk_size
    upload_file.close()
    # Finish the session and commit the file under its base name
    # inside the target Dropbox folder.
    path_args = {
        "cursor": {
            "session_id": session_id,
            "offset": file_size
        },
        "commit": {
            "path": os.path.join(sys.argv[1], os.path.basename(upload_file.name)),
            "mode": "overwrite",
            "autorename": False
        }
    }
    response = requests.post(
        "https://content.dropboxapi.com/2/files/upload_session/finish",
        headers={
            "Authorization": "Bearer {0}".format(access_token),
            "Dropbox-API-Arg": json.dumps(path_args),
            "Content-Type": "application/octet-stream"
        }
    )
    if not response.ok:
        exit_on_error(response)


if __name__ == '__main__':
    main()
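Usage is the same as before; with tqdm installed you get a progress bar while the remaining chunks are appended. The output below is illustrative (roughly a 50 MB file with 2 MB chunks); the exact numbers depend on your file size and connection speed:

$ python simple_db_uploader.py /db_folder local_file_to_upload.extension
100%|██████████| 24/24 [00:39<00:00,  1.66s/it]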
