
Commit

Implemented backoff. It works by simply fishing the Retry-After out of 429 responses and making a recursive call from that block in apiCall(). I've gone through trying to use async/await in a few more places to keep from getting entire barrages of 429 responses back, but that means sending off one request at a time from numerous places in csvData(), waiting for a response, checking it isn't a 429, sending off the next request, etc., instead of sending volleys. That ends up being *really slow* for large playlists. Machine-gunning requests with a short delay between each simply works better, especially since I have to make 4 subsequent volleys per playlist (songs from playlist, unique artists, unique albums, song audio features). As a bonus of this process, I realized I don't really need to use a delay when fetching the list of playlists, since I already await each response, which builds in a delay just from message transit time to and from the server, and then I realized I could eliminate an unnecessary variable.
pavelkomarov committed Sep 6, 2024
1 parent b722dd2 commit b4cf03e
Showing 1 changed file with 13 additions and 10 deletions.
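For reference, here is a minimal sketch of the two ideas described in the commit message, assuming browser fetch; the function names and the sleep-before-fetch placement are illustrative, not the exact exportify.js code:

const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))

// Assumed shape of the API helper: wait `delay` ms, fetch, and on a 429 retry after
// however long the Retry-After header asks for (given in seconds, so convert to ms).
async function apiCall(url, access_token, delay = 0) {
  await sleep(delay) // staggers this request relative to its siblings in a volley
  let response = await fetch(url, { headers: { 'Authorization': 'Bearer ' + access_token } })
  if (response.ok) { return response.json() }
  if (response.status == 429) {
    return apiCall(url, access_token, response.headers.get('Retry-After')*1000)
  }
  throw new Error('HTTP ' + response.status) // other statuses are handled differently in the real code
}

// Fire a whole volley at once, each request offset by 100 ms, instead of awaiting them one by one.
function fetchArtistChunks(artist_chunks, access_token) {
  return Promise.all(artist_chunks.map((chunk_ids, i) => apiCall(
    'https://api.spotify.com/v1/artists?ids=' + chunk_ids.join(','), access_token, 100*i)))
}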
23 changes: 13 additions & 10 deletions exportify.js
@@ -1,5 +1,3 @@
- rateLimit = '<p><i class="fa fa-bolt" style="font-size: 50px; margin-bottom: 20px"></i></p><p>Exportify has encountered a <a target="_blank" href="https://developer.spotify.com/documentation/web-api/concepts/rate-limits">rate limiting</a> error, which can cause missing responses. The browser is actually caching those packets, so if you rerun the script (wait a minute and click the button again) a few times, it keeps filling in its missing pieces until it succeeds. Open developer tools with <tt>ctrl+shift+E</tt> and watch under the network tab to see this in action. Good luck.</p><br/>'

// A collection of functions to create and send API queries
const utils = {
// Query the spotify server (by just setting the url) to let it know we want a session. This is literally
@@ -24,7 +22,13 @@ const utils = {
let response = await fetch(url, { headers: { 'Authorization': 'Bearer ' + access_token} })
if (response.ok) { return response.json() }
else if (response.status == 401) { window.location = window.location.href.split('#')[0] } // Return to home page after auth token expiry
- else if (response.status == 429) { if (!error.innerHTML.includes("fa-bolt")) { error.innerHTML += rateLimit } } // API Rate-limiting encountered (hopefully never happens with delays)
+ else if (response.status == 429) {
+ if (!error.innerHTML.includes("fa-bolt")) { error.innerHTML += '<p><i class="fa fa-bolt" style="font-size: 50px; margin-bottom: 20px">\
+ </i></p><p>Exportify has encountered <a target="_blank" href="https://developer.spotify.com/documentation/web-api/concepts/rate-limits">\
+ rate limiting</a> while querying endpoint ' + url.split('?')[0] + '!<br/>Don\'t worry: Automatic backoff is implemented, and your data is \
+ still downloading. But <a href="https://github.com/pavelkomarov/exportify/issues">I would be interested to hear about this.</a></p><br/>' }
+ return utils.apiCall(url, access_token, response.headers.get('Retry-After')*1000)
+ } // API Rate-limiting encountered (hopefully never happens with delays)
else { error.innerHTML = "The server returned an HTTP " + response.status + " response." } // the caller will fail
},

@@ -57,14 +61,13 @@ class PlaylistTable extends React.Component {

// Retrieve the list of all the user's playlists by querying the playlists endpoint.
// https://developer.spotify.com/documentation/web-api/reference/get-list-users-playlists
- let offset = 0, nplaylists = null
+ let offset = 0, response = null
do {
- let response = await utils.apiCall("https://api.spotify.com/v1/users/" + user.id + "/playlists?limit=50&offset=" + offset,
- this.props.access_token, offset*2) // only one query every 100 ms
- if (!nplaylists) { nplaylists = response.total} // Fish the total number of playlists out of the response.
+ response = await utils.apiCall("https://api.spotify.com/v1/me/playlists?limit=50&offset=" + offset,
+ this.props.access_token) // no need for a delay, because I'm awaiting each response, which builds in transit-time delay
playlists.push(response.items)
offset += 50 // playlists can be grabbed up to 50 at a time
- } while (offset < nplaylists) // Go again if we haven't gotten them all yet.
+ } while (offset < response.total) // Go again if we haven't gotten them all yet.

//add info to this Component's state. Use setState() so render() gets called again.
this.setState({ playlists: playlists.flat() }) // flatten list of lists into just a list
@@ -176,7 +179,7 @@ let PlaylistExporter = {

// This is where the magic happens. The access token gives us permission to query this info from Spotify, and the
// playlist object gives us all the information we need to start asking for songs.
- csvData(access_token, playlist) {
+ async csvData(access_token, playlist) {
let increment = playlist.name == "Liked Songs" ? 50 : 100 // Can max call for only 50 tracks at a time vs 100 for playlists

// Make asynchronous API calls for 100 songs at a time, and put the results (all Promises) in a list.
@@ -211,7 +214,7 @@ let PlaylistExporter = {
artist_ids = Array.from(artist_ids) // Make groups of 50 artists, to all be queried together
let artist_chunks = []; while (artist_ids.length) { artist_chunks.push(artist_ids.splice(0, 50)) }
let artists_promises = artist_chunks.map((chunk_ids, i) => utils.apiCall(
- 'https://api.spotify.com/v1/artists?ids='+chunk_ids.join(','), access_token, 100*i))
+ 'https://api.spotify.com/v1/artists?ids='+chunk_ids.join(','), access_token, 100*i)) // volley of traffic, requests staggered by 100ms
return Promise.all(artists_promises).then(responses => {
let artist_genres = {} // build a dictionary, rather than a table
responses.forEach(response => response.artists.forEach(
