
Sandboxing evaluated JavaScript code #207

Open
mashdragon opened this issue Dec 30, 2023 · 3 comments
Labels
enhancement New feature or request

Comments

@mashdragon

I would like to run "somewhat untrusted" JavaScript code in PythonMonkey in a sandbox with restrictions on what it is allowed to do. For example, I would like to run code which should not be able to:

  • Access the this.python object
  • Make network requests via XMLHttpRequest or fetch
  • Do anything with the bootstrap object, such as making local file requests

Is this something that could possibly be done? I assume such options would go in the evalOpts argument of eval (documentation of the evalOpts argument would be helpful to have).

Currently I am using this crude approach that just deletes the objects from the globalThis object, but I do not know if it is secure in any way:

import pythonmonkey as pm

# Delete the capability-granting globals before evaluating untrusted code
for key in ['python', 'bootstrap', 'pmEval', 'XMLHttpRequestEventTarget',
            'XMLHttpRequestUpload', 'XMLHttpRequest']:
  del pm.globalThis[key]


@wiwichips wiwichips added the enhancement New feature or request label Dec 31, 2023
@wiwichips
Collaborator

wiwichips commented Jan 5, 2024

Hi @mashdragon, this is a really cool feature request, but it is probably outside the scope of PythonMonkey.

Interestingly, our main product at Distributive is a distributed compute platform that uses edge computing to execute JavaScript / WebAssembly in parallel. Anyone can contribute their compute to our network by running a worker (https://dcp.work/).

To do this, we have to evaluate JavaScript code within a sandbox (https://github.com/Distributed-Compute-Labs/dcp-client/tree/release/libexec/sandbox); you might find that code interesting or relevant.

@philippedistributive philippedistributive closed this as not planned Feb 2, 2024
@Xmader
Member

Xmader commented May 8, 2024

Might be solved by #208

@Xmader Xmader reopened this May 8, 2024
@wesgarland
Collaborator

I won't stake my life on it, but removing everything from the global object is PROBABLY okay for what you're after, as that is how we "give" capabilities to JS in the first place.

Where this might prove vulnerable is in attacks on the Python engine, accessed either by walking the prototype chain of supplied methods (e.g. setTimeout, console.log) or through Python type wrappers. Securing Python is completely out of scope for us.
