
lots of various improvements.. #283

Open: wants to merge 36 commits into master

Conversation


@warren-bank warren-bank commented Jan 16, 2022

both in terms of making the design simpler for adding new providers..
and in terms of adding features (such as re-adding support for the hCaptcha provider).

status:

  • builds and runs great

to-dos:

  • although I updated the tests, I haven't actually run them yet.. so I'm sure they'll need an hour or two to get everything passing

add:
 * LICENSE
 * crx and xpi build scripts
 * Firefox ID in manifest
   - for installing an unsigned xpi in FF Developer edition

fix:
 * webpack config to support building on Windows

change:
 * name of output directory from 'dist' to 'PrivacyPass'
 * npm 'clean' script to include:
   - typescript output 'lib' directory
   - crx and xpi files generated by new build scripts
 * npm 'sjcl' script to explicitly call perl,
   rather than depending upon the shell to parse the shebang line
use chrome.i18n to internationalize strings in UI
* update message passing between userscript and background page
* update redux store used by popup

summary:
========

* messages are currently only passed in one direction:
  - from userscript to background page (w/ response to callback)
  - messages:
    1. {clear: true}
       - removes all passes for all providers from local storage
       - no response
    2. {tokensCount: true, providerID: number}
       - providerID is:
         * optional;
           if not included, then the response includes all providers
         * the same integer that identifies each unique provider
             1 == Cloudflare
             2 == hCaptcha
             etc..
       - response: {[key: string]: number}
         * each key is a `${providerID}`
         * each value is a count of the number of passes in wallet
       - for example:
           {"1": 30, "2": 5}
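The background page's handling of these two messages could be sketched like this (a minimal sketch; `wallet` and `handleMessage` are illustrative names, not the PR's actual identifiers):

```typescript
// Illustrative sketch of the one-directional message protocol described above.
type TokensCountResponse = { [providerID: string]: number };

// In-memory stand-in for the passes kept in local storage.
const wallet: Map<number, unknown[]> = new Map([
  [1, new Array(30).fill('cf-pass')], // 1 == Cloudflare
  [2, new Array(5).fill('hc-pass')],  // 2 == hCaptcha
]);

function handleMessage(
  message: { clear?: boolean; tokensCount?: boolean; providerID?: number }
): TokensCountResponse | undefined {
  if (message.clear) {
    wallet.clear(); // remove all passes for all providers; no response
    return undefined;
  }
  if (message.tokensCount) {
    const response: TokensCountResponse = {};
    for (const [id, passes] of wallet) {
      // when providerID is given, restrict the response to that provider
      if (message.providerID === undefined || message.providerID === id) {
        response[`${id}`] = passes.length;
      }
    }
    return response;
  }
  return undefined;
}
```

In the real extension this logic would sit inside the `chrome.runtime.onMessage` listener, with the return value passed to the sender's callback.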

* the data structure used by the redux store to hold state
  is identical to the response from the background page
  to messages that request 'tokensCount'
* refactor the 'voprf' crypto module into a reusable class,
  so each provider can own a distinct instance

* add the dependency: 'keccak'
  - v2 of the extension included a snapshot of this library
  - v3 removed it, but there is still code that wants to use it
    * presumably, this branch of code will get called
      when the "hCaptcha" provider is reimplemented (..coming soon)
* somewhere between a bug fix and a performance optimization:
  - pass an instance of the 'voprf' crypto class
    from provider(s) to the Token constructor
  - otherwise, each Token initializes its own instance
    of the 'voprf' crypto class using default parameters
* hCaptcha provider is added, but not yet functional
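The instance-sharing described above can be sketched as follows (a sketch; `VoprfCrypto`, `Provider`, and `createToken` are illustrative names, not the actual code):

```typescript
// Illustrative sketch: each provider owns one crypto instance and hands it
// to every Token it creates, instead of each Token constructing its own
// instance with default parameters.
class VoprfCrypto {
  constructor(public readonly curve: string = 'p256') {}
}

class Token {
  // falls back to a fresh default instance when none is supplied
  constructor(public readonly voprf: VoprfCrypto = new VoprfCrypto()) {}
}

class Provider {
  private readonly voprf = new VoprfCrypto('p256');
  createToken(): Token {
    // pass the shared instance so Tokens don't re-initialize defaults
    return new Token(this.voprf);
  }
}
```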

status:
=======
* "handleBeforeRequest()" detects when
  a captcha is solved on an issuing domain
* "issue()" makes a subsequent request
  for tokens to be signed
* the response data does include "signatures",
  which can be properly parsed:
    {
      sigs:    string[];
      proof:   string;
      version: string = "1.0";
    }

to do:
======
* the code currently expects the commitment
  for the version to be formatted the same as Cloudflare:
    {H: string; expiry: string; sig: string;}
* however, the JSON data file:
    https://raw.githubusercontent.com/privacypass/ec-commitments/master/commitments-p256.json
  shapes the data for version HC["1.0"] differently:
    {H: string; G: string;}
* anecdotally, the part that hurts most..
  is that this data file includes a version HC["1.01"]
  that is shaped in a way that is consistent and would work
* hCaptcha provider is added, but not yet functional

status:
=======
* "handleBeforeRequest()" detects when
  a captcha is solved on an issuing domain
* "issue()" makes a subsequent request
  for tokens to be signed
* the response data does include "signatures",
  which can be properly parsed:
    {
      sigs:    string[];
      proof:   string;
      version: string = "1.0";
      prng?:   string = undefined;
    }
* the version "1.0" commitment is shaped:
    {H: string; G: string;}
  which, unlike Cloudflare, does not require verification
* since "prng" is not defined,
  it should default to "shake",
  as was the methodology used by the v2 extension

to do:
======
* the "shake" prng has a missing dependency: Buffer

warren-bank commented Jan 16, 2022

here are releases that include unsigned packed extensions.. for browsers that allow them to be installed

ex:


ppopth commented Jan 17, 2022

  1. Is this still in progress?
  2. Why did you make .bin a hidden directory?


ppopth commented Jan 17, 2022

In any case, thank you for the patch 😄

@@ -38,7 +38,7 @@ $ npm ci
$ npm run build
```

After that, the `dist` folder will contain all files required by the extension.
Member

Why did you change the name to PrivacyPass? I think the name PrivacyPass doesn't indicate that the files in the directory are generated files.

Author

because when Chrome is used to pack an extension, both the name of the .crx and the .pem are expected to be the same as the input directory (all siblings)..

.gitignore allows the .pem to be permanently homed where Chrome expects it to be.. without ever being committed to version control.. and likewise, the generated .crx and .xpi files won't be either

this naming convention is associated with the scripts in the .bin directory.. if you already have tools that you use to package your extensions, then.. you:

  1. won't want/need the .bin directory
    • though I find these scripts useful
  2. won't want to rename the output directory.. as your tooling will already be configured to use the old path

Author

another option.. which might be preferable.. would be to:

  • move the .bin directory to dist/.bin
  • configure webpack to output to dist/PrivacyPass
  • configure .gitignore to exclude: dist/PrivacyPass and dist/PrivacyPass.[pem|crx|xpi]

so all output.. and my scripts to pack the output into extensions.. would be confined under dist

Author

update:
That last suggestion seemed (to me) a much cleaner option.. so I just pushed a commit to make those changes.


ppopth commented Jan 17, 2022

I think many directory and file names are unnecessarily long. For example, pseudorandom-number-generators: by using just prngs, most people will understand it already

const cached = this.getStoredTokens();
this.setStoredTokens(cached.concat(tokens));
// delay the request to issue tokens until next tick of the event loop
setTimeout(
Member

Why did you change to setTimeout here?

Author

because (if memory serves) that's a blocking API method.. and there's no need for the browser to wait for the secondary request to finish; using a next tick allows the triggered code to run asynchronously.

maybe there's a good reason to not do it this way, but it struck me as a better idea while I was touching that portion of code.

Author

actually.. my answer just now was wrong..

(async () => {...})()

wouldn't block..

I think my reason for adding the timeout in the first place had to do with hCaptcha. I don't entirely understand how their process works.. but it seems (from trial and error) very clear that the intercepted request needs to complete before the triggered request to issue tokens can be made.. otherwise it fails. So.. I don't know if removing this timeout would break hCaptcha or not.. but there's a good chance that it might.
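For reference, the deferral behavior being discussed can be sketched with a small promise wrapper (a hypothetical `defer` helper, not code from this PR): the callback does not run synchronously inside the intercepted-request handler, but on a later tick of the event loop.

```typescript
// Illustrative sketch: defer work (e.g. the token-issue request) until
// after the current handler has returned, via setTimeout.
function defer<T>(fn: () => T, ms = 0): Promise<T> {
  return new Promise((resolve) => setTimeout(() => resolve(fn()), ms));
}
```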

Author

Of course.. these comments about hCaptcha don't apply to the plugin for the Cloudflare provider; the update to Cloudflare was just because I wanted to keep the 2x plugins as nearly identical as possible.. for symmetry.. so any differences in logic can be easily grok'd by diff'ing the two.

@warren-bank warren-bank Jan 19, 2022

update:
Regarding my earlier comment/assertion that hCaptcha requires the timeout.. and also..
Regarding my earlier comments about testing the ES5-transpiled release of the extension in older browsers..
I just ran a very interesting test that addresses both topics:

  • added a polyfill library to the public directory and loaded it from both the background page and the popup
  • using Chrome 30:
    • packaged the extension to a crx2 and installed
      • no errors or warnings.. everything looks good
    • solved an hCaptcha and inspected network traffic
      • request from normal window tab to solve captcha: 200
      • request from background page to issue tokens: 403
        • {"success":false,"error-codes":["invalid-data"]}
        • which is the response that occurs when the request to issue tokens happens concurrent to the POST to solve
  • made a quick change to the timeout in the background page: increased from 0 to 2500 milliseconds
    • repacked the crx2 and reinstalled
    • solved another hCaptcha and... earned 5x tokens
      • it works in Chrome 30! ..how cool is that?
  • the Cloudflare provider doesn't work out-of-the-box
    • haven't looked into it yet
    • guessing it's because the TLS certificate isn't trusted by this browser
      • which would make the necessary window.crypto API unavailable
      • I'll need to add its root authority.. and retest..
    • update:
      • the problem with the TLS certificate isn't that the root CA is untrusted.. it validates OK
      • the problem is actually that the generic top-level domain used by the CF website that issues tokens (captcha.website) isn't recognized by the browser as a public domain.. here is a related thread on this exact error message
    • update:
      • the TLS certificate might not actually be the cause for CF not working..
      • the only window.crypto API needed is available from a non-secure context
        • I was originally thinking of window.crypto.subtle.. which I was using recently in another project
      • the network trace doesn't seem to show any requests that match the criteria used by the extension to trigger an issue request..
        • I'll need to debug.. and see if the code to trigger that secondary request is actually firing
    • confirmed:
      • the PrivacyPass chrome extension is working
      • the issue is that Cloudflare captchas don't work in this browser
        • presumably, whatever version of Chrome/FF that is minimally supported by the website/service will also work with the extension (w/ ES6+ polyfills)

Author

I just added another commit to this branch.. it's tagged as v3.4.0 with a corresponding crx under releases

this design is:

  1. much cleaner
  2. much more reliable, since the timing of the secondary request depends upon completion of the 1st

the result is:

  • stable and fully functional

@warren-bank warren-bank Jan 20, 2022

PS: this new release was built using the improved build scripts, which automate the insertion of the ES6+ polyfill library (that I mentioned previously) into the CRX2 extension.. so

  • the CRX3 does not include polyfill
    • since it only runs in newer browsers
  • the CRX2 does include polyfill
    • since it only runs in legacy browsers
    • this backport adds support for some crazy old browsers..
      • the primary bottleneck now is whether or not the websites operated by each provider (ie: where captchas are solved) will also support the older browsers


const CHL_BYPASS_SUPPORT = 'cf-chl-bypass';
const DEFAULT_ISSUING_HOSTNAME = 'captcha.website';
const ALL_ISSUING_CRITERIA: {
Member

Making the criteria programmable is more flexible than making it configurable. Please read https://blog.cloudflare.com/privacy-pass-v3/ for more detail. We had difficulty making things configurable in the past because configuration is less flexible.

That is, if we want to change the criteria, it makes more sense to change the code in some function rather than change some configuration object.

@warren-bank warren-bank Jan 17, 2022

I absolutely agree with you..

  • that it would be bad to:
    • only allow providers to supply data used to configure global behavior (within narrow limits)
  • that it is much better to:
    • require that each provider implement a set of hook functions

However, since some tasks are repeated by all providers.. it makes sense (to me) to provide tooling in the form of shared static methods that providers have the option to utilize from within their custom hook functions.

Those tools will of course require formatted input that providers who choose to call these methods will need to supply; that's all that these particular data structures are for.

@warren-bank warren-bank Jan 17, 2022

to make my earlier comment more concrete..

handleBeforeRequest() needs to quickly determine whether the intercepted request:

  1. solves a captcha belonging to the particular provider
  2. is being solved on a domain that issues tokens

these determinations are always based on:

  1. the requested url
  2. POST data (if present)

I added static methods that allow providers to pass the data from the request as well as a configuration object.. and the result is a simple yes or no.. the request does or doesn't match the filter criteria as configured

If a provider wants to apply specialized logic.. then that's great; simply don't use these methods and roll your own.
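A minimal sketch of what such a shared static matcher might look like (the `IssuingCriteria` shape and the function name are illustrative assumptions, not the PR's actual API):

```typescript
// Illustrative sketch: a shared helper that a provider's hook can call
// to test an intercepted request against configured filter criteria,
// based on the requested URL and POST data (if present).
interface IssuingCriteria {
  hostname: string;        // domain that issues tokens
  pathnamePrefix?: string; // optional path filter
  postParam?: string;      // POST field that must be present
}

function matchesIssuingCriteria(
  url: string,
  postData: Record<string, string> | null,
  criteria: IssuingCriteria
): boolean {
  const parsed = new URL(url);
  if (parsed.hostname !== criteria.hostname) return false;
  if (criteria.pathnamePrefix && !parsed.pathname.startsWith(criteria.pathnamePrefix)) {
    return false;
  }
  if (criteria.postParam && !(postData && criteria.postParam in postData)) {
    return false;
  }
  return true; // the request does match the filter criteria as configured
}
```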


warren-bank commented Jan 17, 2022

regarding using long/descriptive file and directory names..

  • I figure that since the code gets bundled together during the build process and these names are only ever seen by the dev team.. there's no harm in using them
  • I'll be the first to admit that I had to google the meaning of prng :)

I'm sure that a similar observation/criticism could be made of my choice for using long/descriptive class and variable names..

  • to which I would make a similar response, that code minification does away with these names during the build process
    • I've enabled minification for production builds
    • the upstream repo has it completely disabled

In both cases, it's a matter of personal preference and whatever conventions the project chooses to use. I prefer descriptive names so I don't have to relearn the code when I look at it months or years later. But, it's not my project.. so do as you like.


warren-bank commented Jan 17, 2022

regarding whether this is still a work in progress..

without getting too philosophical.. :)

  • the resulting extension is stable and works exactly as it should
  • I've accomplished my initial goals.. and am satisfied with the result

so.. nope; I'd consider it a release candidate.

of course, we can discuss some of the decisions and make a few tweaks.. but this definitely moves the ball forward.

@warren-bank

One thing that I didn't touch.. but would take a good look at if it was my project.. is the ECMAScript version of the generated bundle that gets packed into the extension. It's currently set to ES2020.

Personally, I always like to support as many users as possible.. and would probably down level all the way to ES5 (if feasible).

@warren-bank

PS: regarding my earlier comment about needing to test the tests.. I made an additional commit a few hours ago that fixes the tests (mainly to adjust for how to confirm that the code in the new timeouts has run). Now everything passes.


warren-bank commented Jan 17, 2022

I just added another commit to this branch.. it's tagged as v3.3.1 with a corresponding crx under releases

  • makes a tiny change to the webpack configs
  • the resulting crx:
    • is nearly half the size
    • contains ES5 code that will run in many more (ie: older) browsers
      • off-hand, I'd guess that support for window.crypto.getRandomValues is the new bottleneck, but according to this.. it was available in Chrome v11

If there's a reason that you've chosen to target ES2020.. feel free to exclude this commit.

Anecdotally, I tested the resulting crx and it works exactly the same.

update:

  • Out of curiosity.. I just tried to repack this extension in crx2 format and install on Chrome v30
    • which I like to keep a copy of.. since it's nearly identical to the embedded WebView in Android 4.x
    • it installed, but the console log from the background page shows an error:
      • that the bundled javascript still includes const declarations
      • which I confirmed by grep'ing for that text in the .js bundle
  • something hinky is going on
    • this needs some more attention..

possible cause:

  • I believe that:
    • the config change in the last commit only affects the javascript glue added by webpack
    • the typescript can also target ES5 by changing in tsconfig.json:
        {
            "compilerOptions": {
                "target": "ES5",
                "downlevelIteration": true,
            },
        }
    • however, plain-old javascript is not being transpiled
      • originating from both within the project, as well as external dependencies
      • for example.. the buffer polyfill
  • I'm going to play around with reconfiguring webpack to use babel more directly..

update:

  • release tag v3.3.2 fixes this and the new crx is 100% ES5.. it was an issue with the webpack configuration.. needing to add an exception to the exclude regex for particular node modules that need to be transpiled
  • the bottleneck in Chrome 30 is that buffer v5.x uses the Uint8Array typed array.. and adds a notification to the console log in browsers that don't support this language feature that they should downgrade to use buffer v4.x instead
    • according to this, buffer v5.x should work in Chrome 39+ ..but will need to test to confirm
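For reference, the v3.3.2 fix described above (an exception in the babel-loader exclude regex so certain node modules are still transpiled) might look roughly like this (a sketch; the loader options and targets are assumptions, not the PR's actual config):

```typescript
// Illustrative webpack.config.ts sketch: the usual `exclude: /node_modules/`
// is widened with a negative lookahead so that the ES6-only modules
// ('buffer', 'keccak') still pass through babel-loader.
const excludeExceptESModules = /node_modules\/(?!(buffer|keccak)\/)/;

const config = {
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: excludeExceptESModules, // transpile buffer and keccak too
        use: {
          loader: 'babel-loader',
          options: { presets: [['@babel/preset-env', { targets: 'ie 11' }]] },
        },
      },
    ],
  },
};

export default config;
```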


warren-bank commented Jan 17, 2022

2x points to bring up (before I forget)..

  1. In the v2 extension, I'm pretty sure that the hCaptcha provider defaulted to issuing 5x tokens per solved captcha because no explicit value was defined.. and 5x was inherited from the default configs. When configuring the new hCaptcha provider, I set the value to 5x tokens.. because I didn't know if it would be OK to increase the number.. for example to 30x, as Cloudflare does. Do you know what limit is set on the server-side? ..whether it is 5.. or if we could go higher?
  2. It would appear that the way hCaptcha issues tokens prevents the ability to do so from within an incognito window; ..or rather, makes it more difficult.. and will require a workaround. My observation is that the user's POST containing the captcha solution (Req1) needs to occur before our POST containing the tokens to be signed (Req2). Furthermore, when Req1 occurs in an incognito window, Req2 always fails. My assumption is that the response to Req1 includes a cookie that needs to be sent back in Req2. That would explain why an incognito window (which sandboxes its cookies) would prevent Req2 (which is made by the background page and doesn't live within the incognito sandbox) from working. Is this something that you've already looked into.. and brainstormed workarounds for?

fix: transpile ES6 node modules: 'buffer' and 'keccak'
change:
 * name of output directory from 'PrivacyPass' to 'dist/PrivacyPass'
 * location of crx and xpi build scripts from '.bin' to 'dist/.bin'
 * location of pem input and crx/xpi output files from '/' to 'dist'
* the methodology used in this commit does NOT work
  - I fully expected that it would
  - it does NOT
  - this commit is being tagged so I can point to it later,
    if asked: "why didn't you just...?"
* the methodology is to:
  - detect requests that solve captchas made to issuing domains
    * as was done previously
  - rather than make the issuing request then and there,
    wait until the corresponding call to "handleHeadersReceived"
* problem:
  - there (apparently) is no corresponding call to "handleHeadersReceived"
* to do:
  - add new hook for call to "chrome.webRequest.onCompleted"
  - add plumbing for all providers to receive this new hook function
  - ...hope it receives a corresponding call

- - - -

hCaptcha should send issuing requests from a different hook function

similar to the way redemption works (but in reverse),
detect solved captcha on issuing domains before requests are sent,
but delay the processing of hits until after the response is received

previously, a timeout was used to delay processing;
but this methodology was inexact and could lead to failed attempts.

note that this is in stark contrast to the Cloudflare provider,
which cancels the requests that it detects,
and processes those hits immediately without any delay.

warren-bank commented Jan 29, 2022

Are you using Firefox?

Earlier you mentioned using Chrome. My tests with Chrome show a 100% success rate. But.. maybe there are other variables (such as geography, network speed, hardware speed) that could result in a different outcome. Dunno..

@MikeGitAZ

No, Chrome. That was half joking although I definitely get frustrated with the captchas at times.

@warren-bank

Oh ok. Well.. personally, I couldn't care less about Firefox.. but this is just super frustrating.. because the extension appears to be doing everything right. Without knowing what's going on at the server.. I'm just banging my head against a wall.

@MikeGitAZ

I understand. Personally, I was somewhat surprised to discover that such an extension/add-on could even have a chance to work. I figured that the captcha people wouldn't like the idea of it and would code defensively against such functionality, even though to me it doesn't seem like much of a threat. It's just that they're all about gatekeeping.

@warren-bank

warren-bank commented Jan 29, 2022

From the first sentence of the README:

The Privacy Pass protocol is now being standardised by the privacypass IETF working group.

This project isn't about finding a clever way to work around the captcha providers. My understanding is that they want this solution to work, and are in collaboration to establish a standard. To do so doesn't make their services obsolete.. quite the opposite; it makes their services more transparent.. less invasive.. and more likely to be used ubiquitously across the interweb.

@MikeGitAZ

Yes, I get that, but I was rather surprised when I first found out about it. I'm just saying that security types tend to be on the cautious side.


warren-bank commented Jan 30, 2022

@MikeGitAZ ..I just tested the 3x URLs that have been listed in this conversation from a VPN server in São Paulo, Brazil:

  • https://www.ancestry.com/search/collections/30105/
    • success
    • Cloudflare: redeems 1x token per session (not per page load)
  • https://streameast.live/
    • success
    • Cloudflare: redeems 1x token per session (not per page load)
  • https://aaep.org/contact
    • failure
    • hCaptcha: this is interesting..
      • in the US: hCaptcha is the only provider on the page (checkbox in iframe)
        • hCaptcha plugin is active in extension
        • captcha is solved automatically by hCaptcha plugin when checkbox is clicked
      • in Brazil: in addition to hCaptcha, Cloudflare is detected on the page (though no tokens are redeemed during page load)
        • Cloudflare plugin is active in extension, because it is intentionally prioritized
        • captcha is not solved automatically by hCaptcha plugin when checkbox is clicked, because the hCaptcha plugin is not active

@warren-bank

This comment has been minimized.


warren-bank commented Jan 30, 2022

disregard my last comment..

  • I thought I found the underlying cause of the problem with hCaptcha in Firefox (re: blocked 3rd-party cookie)
  • it seemed to be working perfectly during testing
    • which anecdotally always seems to be the case immediately after I restart Firefox
  • old/broken behavior a few minutes later

I'm not even sure if cookies are used to authenticate the 2nd request..

  • the network tab doesn't indicate that any cookies are sent by either successful or unsuccessful requests (see attached screenshot)
  • it's possible that the value in the URL pathname is a UUID
    • it is also included in the 1st request that contains the solution to the captcha, so maybe this UUID is temporarily authorized to make the 2nd request.. dunno
    • but that doesn't make any sense either based on the observed behavior, because it doesn't explain why issuing hCaptcha tokens doesn't work in Chrome when the 1st request occurs in an incognito window
    • really need some insight from hCaptcha into wtf is going on..

[attachment: Firefox-screenshot-2]

[attachment: Firefox-screenshot-3]


warren-bank commented Jan 30, 2022

summary of what does consistently work correctly (v3.6.6):

  • new features in extension popup:
    • ability to backup all tokens to a JSON text file
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to restore tokens from a JSON text file
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
  • Cloudflare
    • ability to issue tokens from a normal window
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to issue tokens from a private/incognito window
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to issue tokens when a "User-Agent" switcher extension is active
    • ability to redeem tokens from a normal window
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to redeem tokens from a private/incognito window
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to redeem tokens when a "User-Agent" switcher extension is active
  • hCaptcha
    • ability to issue tokens from a normal window
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to issue tokens from a private/incognito window
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to issue tokens when a "User-Agent" switcher extension is active
    • ability to redeem tokens from a normal window
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to redeem tokens from a private/incognito window
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev
    • ability to redeem tokens when a "User-Agent" switcher extension is active

summary of unresolved issues (v3.6.6):

  • Cloudflare
  • hCaptcha
    • ability to issue tokens from a normal window is erratic
      • Firefox 97-dev
    • ability to issue tokens from a private/incognito window is erratic
      • Chrome 85, Chrome 67, Chrome 30, Firefox 97-dev

notes:

  • hCaptcha
    • erratic behavior
      • behavior by the extension is consistent
      • response to requests made by the extension is not consistent

During some real-world testing,
I noticed that Chrome doesn't always parse
'application/x-www-form-urlencoded' data in the POST body.

This new helper method is available to providers to normalize
the format of this data into a key/value hash object.

Furthermore, it will also parse 'application/json' data.
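A minimal sketch of such a normalizing helper (the name `parsePostBody` and its signature are illustrative, not the actual method in this commit):

```typescript
// Illustrative sketch: normalize a raw POST body into a key/value hash,
// handling both 'application/x-www-form-urlencoded' and 'application/json'.
function parsePostBody(
  raw: string,
  contentType: string
): Record<string, string> | null {
  try {
    if (contentType.includes('application/json')) {
      const obj = JSON.parse(raw);
      return typeof obj === 'object' && obj !== null ? obj : null;
    }
    if (contentType.includes('application/x-www-form-urlencoded')) {
      const result: Record<string, string> = {};
      for (const [key, value] of new URLSearchParams(raw)) {
        result[key] = value;
      }
      return result;
    }
  } catch {
    // fall through on malformed input
  }
  return null;
}
```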
previous methodology:
=====================
* popup window:
  - dynamically creates an input file element
  - click event triggers the element to open the file chooser dialog
  - processes the list of selected files by:
    * reading the content of each file
    * parsing its JSON
    * passing the resulting object in a message to the background page
* background page:
  - merges backup data with the tokens already saved in local storage
  - updates local storage with this new aggregate value
  - updates the popup window

problem with previous methodology:
==================================
* only works if the browser doesn't close the popup window
  before the data is passed to the background page
* many browsers do close the popup window
  when the file input dialog is triggered for selection of input files

new methodology:
================
* popup window:
  - sends a message to the background page
* background page:
  - opens a new tab and loads a static html page
* static html page:
  - dynamically creates an input file element
  - adds a click event handler to the element,
    which requires user interaction to trigger
  - the click event handler processes the list of selected files by:
    * reading the content of each file
    * parsing its JSON
    * passing the resulting object in a message to the background page
  - the click event handler also tracks the count of files pending
    * after the processing of all files is complete,
      sends a final message to the background page
* background page:
  - merges backup data with the tokens already saved in local storage
  - updates local storage with this new aggregate value
  - closes the tab containing the static html page

comparison between methodologies:
=================================
* previous methodology:
  - pros:
    * simpler implementation
    * more elegant user experience
  - cons:
    * doesn't work in many browsers
* new methodology:
  - pros:
    * works in all supported browsers
  - cons:
    * much more complicated implementation
    * less elegant user experience,
      which requires interaction with a standalone page in a new tab
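The merge step performed by the background page (in both methodologies) can be sketched as follows (a sketch; `mergeBackup` and the store shape are illustrative, and no de-duplication is attempted):

```typescript
// Illustrative sketch: backup tokens are appended to any tokens already
// in local storage, keyed by providerID, producing the new aggregate value.
type TokenStore = { [providerID: string]: string[] };

function mergeBackup(stored: TokenStore, backup: TokenStore): TokenStore {
  const merged: TokenStore = { ...stored };
  for (const [providerID, tokens] of Object.entries(backup)) {
    merged[providerID] = (merged[providerID] ?? []).concat(tokens);
  }
  return merged;
}
```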

ppopth commented Feb 19, 2022

Hi @warren-bank, Since this PR is quite huge and it touches the code structure of the project which needs discussion, would you mind if I ask you to create a smaller PR first? (For example, the PR which only makes hCaptcha tokens work)

Another thing: could you make the commit messages up-to-date? For example, "work in progess.." means that the work is not finished yet.

There are many things that should be done by the maintainer, such as updating the version number.

I'm sorry for the delayed response and thank you for the PR.

@warren-bank
Author

warren-bank commented Feb 19, 2022

Since this PR is quite huge and it touches the code structure of the project, which needs discussion, would you mind if I ask you to create a smaller PR first? (For example, a PR which only makes hCaptcha tokens work.)

I have noticed that a few of my more minor tweaks have already been cherry-picked into standalone PRs and merged. The trouble with this request is that none of the more substantial parts of the code can live in a sandbox, isolated from the rest of the project.

Using the hCaptcha provider plugin as an example:

  • it requires hooks to listeners for chrome.webRequest events that don't exist
  • it requires a library of common tools that doesn't exist (and which, much earlier, you voiced dislike for the idea of adding)
  • ...and probably many other things that don't immediately come to mind, as I haven't looked at this code in quite a while

Another thing is: could you make the commit messages up to date? For example, "work in progess.." means that the work is not finished yet.

There are many things that should be done by the maintainer, such as updating the version number.

These commit messages and version numbers were made during a period of very active development in a separate fork. I never assumed that the dev branch would be merged verbatim. If you were to perform a squashed merge, then:

  • all of the commit messages (both the overly descriptive ones and the non-descriptive ones) will go away
  • the version number can be rolled back to fit the official numbering chronology prior to commit

My point is.. that none of this should matter.


Do you think it would be easier to discuss and selectively undo the parts you don't like?

@armfazh
Member

armfazh commented Feb 21, 2022

Do you think it would be easier to discuss and selectively undo the parts you don't like?

What if we add the parts incrementally with small PRs?
So far, there are a lot of changes and they are difficult to track and review (not saying they are not useful).

I can spot some features that are, by themselves, good additions for the extension:

  • Localization engine
  • Browser detection
  • Cookies handler
  • Any other UI changes
  • hCaptcha support (this may be a big change too)

@warren-bank would you like to create PRs addressing specific changes, or are you ok if we cherry pick changes from your fork?
We appreciate your input and willingness to collaborate :)

@warren-bank
Author

would you like to create PRs addressing specific changes, or are you ok if we cherry pick changes from your fork?

you can cherry pick code from the fork, if:

  • you add back the same LICENSE to your repo that was present @ v2.0.9, which has since been removed
  • you're willing to add me to your list of contributors

regarding your ability to add features incrementally, rather than as a lump sum:
I'd highly recommend inspecting the code diff for each commit in this PR in chronological order,
in addition to the contemporaneous comments/thoughts/questions written in the "conversation" section of this PR;
most features were added one at a time, and build on previous changes/additions/enhancements.
you might be able to say something like:

  • commits a-b are PR 283.1
  • commits b-c are PR 283.2
  • etc..

We appreciate your input and willingness to collaborate

I'm absolutely willing/eager to collaborate;
I'm just leery of being asked to start over and rewrite history..
I'd be much more interested in collectively trying to figure out how to fix/workaround the remaining issues..
so the extension's behavior is consistent and reliable for all users

probably not perfect, but hopefully a good starting point;
PRs are welcome.
* each row of providers contains: name, token count
  - previously, 2x floating divs were used for layout
    * there were no css rules to properly display text overflow
    * when only english strings were displayed,
      this was acceptable
    * when other language translations are displayed,
      this goes wrong very quickly
  - now, a simple 1x row by 2x column table element is used for layout
    * though a flexbox layout would be perfectly suited,
      it's only supported by newer browsers
    * table elements have always existed,
      and they work equally well

example:
========
* run Chrome using strings translated for German locale
  - bash:
      LANGUAGE='de' chrome --lang=de
  - cmd in Windows:
      set "LANGUAGE=de" && chrome --lang=de
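
the 1-row by 2-column table layout described above can be sketched as a small render helper. this is a hypothetical sketch only: `providerRow` and the class names are illustrative assumptions, not the popup's actual markup or identifiers, and the real popup renders through its own UI code.

```javascript
// Illustrative sketch: build one provider row as a table row with
// two cells (name, token count). Class names are assumptions chosen
// for the example; CSS overflow rules would hang off these cells.
function providerRow(name, tokenCount) {
  return (
    "<tr>" +
      `<td class="provider-name">${name}</td>` +
      `<td class="token-count">${tokenCount}</td>` +
    "</tr>"
  );
}
```

because each field lives in its own table cell, long translated strings can be truncated or wrapped per cell instead of colliding, which is the problem the floating divs had.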
previously, request to issue tokens required querystring parameter:
  __cf_chl_captcha_tk__

now, the name of this parameter has been changed to:
  __cf_chl_f_tk
the detection criteria allow for any of several parameter names,
but the backend endpoint that issues Cloudflare tokens
only accepts a single parameter name.
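
the normalization this implies can be sketched as follows. the two parameter names come from the commit messages above; the function name and the returned shape are hypothetical, for illustration only.

```javascript
// Sketch: detection accepts any known challenge-token parameter name,
// but the issuance request is normalized to the single name the
// backend accepts. Function name and return shape are assumptions.
const KNOWN_TOKEN_PARAMS = ["__cf_chl_captcha_tk__", "__cf_chl_f_tk"];
const ISSUANCE_PARAM = "__cf_chl_f_tk";

function extractChallengeToken(url) {
  const params = new URL(url).searchParams;
  for (const name of KNOWN_TOKEN_PARAMS) {
    const value = params.get(name);
    if (value !== null) {
      // Re-key under the one name the issuance endpoint accepts.
      return { [ISSUANCE_PARAM]: value };
    }
  }
  return null; // no challenge token present
}
```

keeping the accepted-names list separate from the issuance name means future parameter renames only need a new entry in the detection list.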