# Fitbit API exporter

This project is a simple script that exports data from Fitbit's Web APIs into a custom InfluxDB database, which can then be graphed via Grafana.
Everything is orchestrated via docker and docker-compose. Grafana dashboards are available at:
- API export: https://grafana.com/grafana/dashboards/12348
- Archive data export: https://grafana.com/grafana/dashboards/12374
## Quick setup (API exporter)

- Basic knowledge of docker is assumed
- Unless otherwise noted, leave fields at their defaults
- The script sources environment variables if the saved config data is missing, and otherwise sources the saved config (the saved tokens take precedence because the access token is refreshed automatically thanks to the refresh token)
- If you don't run the scripts on a server, you may need to refresh the access token and refresh token before the next run, because the refresh token may have already expired (see below)
- Do a test-run of the provided docker-compose.yml and customize it to your liking; ensure the containers come up, especially the InfluxDB one
- Go to the Fitbit developer login page and log in with your account
- Register an application: click on "Register an app" at the top
  - Application name and description to your liking
  - OAuth 2.0 Application Type: Personal
  - Default Access Type: Read-Only
- Get the first parameters: click on "Manage my apps", then click on your new application; note down the following values to fill the corresponding env vars:
  - OAuth 2.0 Client ID: CLIENT_ID
  - Client Secret: CLIENT_SECRET
  - Callback URL: CALLBACK_URL
- Generate the tokens: click on the small "OAuth 2.0 tutorial page" link at the bottom of the page
  - Select "Flow type": "Authorization Code Flow"
  - Select Scopes: activity, heartrate, profile, settings, sleep, weight
  - Click on the link at the end of section 1 ("We've generated the authorization URL for you, all you need to do is just click on the link below:")
  - Select all scopes and click "Allow"
  - Copy the code parameter from the URL of the window that opens: code=[.....]#=
  - Paste it in the "1A Get Code" form
  - Copy the curl call to a script, remove the newlines, and execute the script
  - Copy the JSON output from the script and paste it into the "2: Parse response" section
  - Note down:
    - Access token: ACCESS_TOKEN
    - Refresh Token: REFRESH_TOKEN
- You can now fill the provided `docker-compose.yml` with the parameters obtained
- Pull up the containers
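The tutorial's curl call in steps 1A/2 above is a single POST to Fitbit's token endpoint, authenticated with an HTTP Basic header built from the client ID and secret. The sketch below shows how that request is assembled; the endpoint and grant type follow Fitbit's OAuth 2.0 flow, while the function name and placeholder values are purely illustrative:

```python
import base64
import urllib.parse

# Fitbit's OAuth 2.0 token endpoint (authorization-code exchange)
TOKEN_URL = "https://api.fitbit.com/oauth2/token"

def build_token_request(client_id, client_secret, code, callback_url):
    """Assemble headers and body for exchanging the code for tokens."""
    # Basic auth header: base64("client_id:client_secret")
    auth = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {auth}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urllib.parse.urlencode({
        "client_id": client_id,
        "grant_type": "authorization_code",
        "redirect_uri": callback_url,
        "code": code,
    })
    return headers, body

# Example with placeholder credentials (not real values):
headers, body = build_token_request("ABC123", "secret", "thecode", "http://localhost")
```

The JSON response to this request contains the `access_token` and `refresh_token` fields that the setup steps ask you to note down.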
## Quick setup (Archive export loader)
- Navigate to www.fitbit.com and log in with your credentials
- Go to "Account settings", then "Data Export"
- Select "Export account archive" (or similar wording: the option that exports all available data)
- Confirm the export via the export email
- Wait for the data to be available
- Download the data
- Extract the archive
- Bring up the InfluxDB container: `docker-compose up -d influxdb`
- Run the docker image (recheck the network name, as it depends on docker-compose): `docker run -it --network fitbit-exporter -v <folder_containing_extracted_archive>:/dump -e DB_HOST=... -e DB_PORT=... -e DB_USER=... -e DB_PASSWORD=... -e DB_NAME=... registry.gitlab.com/fsvm88/fitbit-api-exporter:latest /fitbit_export_loader.py`
- Wait for the container to finish
The command can be rerun as needed in case of errors; subsequent runs should skip already-written datapoints based on the timestamps of the entries.
NOTE: the data folder must be mounted at the parent level of `user-site-export` (e.g.: /path/to/folder/username)
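The skip-on-rerun behaviour works because InfluxDB treats a point with the same measurement, tag set, and timestamp as an overwrite rather than a new row. The toy sketch below models that idempotency; the names are illustrative and this is not the loader's actual code:

```python
def write_points(store, points):
    """Toy model of InfluxDB write semantics: (measurement, timestamp)
    acts as the key, so re-writing the same point overwrites it
    instead of creating a duplicate."""
    for measurement, timestamp, value in points:
        store[(measurement, timestamp)] = value
    return store

db = {}
batch = [("steps", "2020-05-01T00:00:00Z", 1200)]
write_points(db, batch)
write_points(db, batch)  # rerun: same keys, so no duplicate datapoints
```

This is why rerunning the loader after an error is safe: already-written entries are simply written again with identical timestamps.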
- Fitbit APIs are limited to 150 calls/hour; the script detects this and sleeps for 1h 10s
- The script sleeps for 4h when there is no additional data to fetch, and only fetches the last day on each round
- Support for older API response formats is limited or non-existent
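The sleep behaviour described above can be sketched as a small decision function. The durations match the ones stated (1h 10s on rate limiting, 4h when idle); the function name and the assumption that rate limiting is detected via HTTP 429 (Fitbit's "Too Many Requests" status) are illustrative, and the script's actual logic may differ:

```python
RATE_LIMIT_SLEEP = 60 * 60 + 10   # 1h 10s: waits out the 150 calls/hour window
NO_DATA_SLEEP = 4 * 60 * 60       # 4h between rounds when nothing new to fetch

def sleep_seconds(status_code, new_data_available):
    """Decide how long to pause after an API polling round."""
    if status_code == 429:        # rate limit hit
        return RATE_LIMIT_SLEEP
    if not new_data_available:    # caught up: only the last day is re-fetched
        return NO_DATA_SLEEP
    return 0                      # more data pending, keep polling
```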
## Intra-day time series
Intra-day time series are not implemented.
I originally wanted to add them, but saw little benefit after finishing the normal time series.
An additional issue with intra-day series is that they require extra setup on Fitbit's side, and you may need to request access via Fitbit support. I have not tried contacting them because I had no need for it.
One last note: intra-day data is much more granular (1m/30s/1s intervals), so each query returns far more data to download and store, which I also had no need for.
`api_poller.py` is all there is to it; use `docker-compose-dev.yml` to pull up a local instance for testing.
Python script, developed with: