
Increase backend payload size limit to 2MB #1374

Merged — 1 commit merged into master on Jan 24, 2025

Conversation

@nygrenh (Member) commented on Jan 24, 2025

Summary by CodeRabbit

  • New Features
    • Increased server payload size limit to support larger requests.
    • Enhanced server configuration to handle more substantial data transfers.

coderabbitai bot (Contributor) commented on Jan 24, 2025

Walkthrough

The changes modify the server configuration in config.rs to increase payload size limits for incoming requests. A new PayloadConfig raises the maximum JSON and general payload sizes from 1 MB to 2 MB, allowing the headless LMS service to accept larger incoming requests.

Changes

services/headless-lms/server/src/config.rs:
  • Added PayloadConfig import from actix_web
  • Increased the JSON payload limit to 2,097,152 bytes (2 MB)
  • Created a new payload_config variable
  • Updated the ServerConfig struct to include payload_config
  • Modified the configure function to register the payload configuration
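The changes above can be sketched as a minimal standalone actix-web server; this is an illustrative reconstruction, not the repository's actual config.rs (the constant name, route, and bind address are assumptions, and compiling it requires the actix-web crate).

```rust
use actix_web::{web, App, HttpResponse, HttpServer};

// 2 MB = 2,097,152 bytes, the limit described in this PR
// (the constant name is illustrative).
const MAX_PAYLOAD_BYTES: usize = 2 * 1024 * 1024;

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| {
        App::new()
            // Applies to JSON extractors such as web::Json<T>.
            .app_data(web::JsonConfig::default().limit(MAX_PAYLOAD_BYTES))
            // Applies to raw payload extractors such as web::Bytes and String.
            .app_data(web::PayloadConfig::new(MAX_PAYLOAD_BYTES))
            .route("/", web::post().to(|| async { HttpResponse::Ok().finish() }))
    })
    .bind(("127.0.0.1", 8080))?
    .run()
    .await
}
```

Both configs are needed because actix-web enforces the JSON extractor limit and the raw payload limit independently; setting only one would leave the other at its default.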

Poem

🐰 Bytes dancing, limits expanding wide,
A rabbit's config takes a joyful stride!
Two megabytes now flow with grace,
Payload limits find their new embrace,
Hopping through data, swift and light! 🚀



coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Nitpick comments (1)
services/headless-lms/server/src/config.rs (1)

66-68: Creation of payload_config is correct and consistent.
Having a separate PayloadConfig object for non-JSON request handling is a neat way to uniformly apply the 2 MB limit. If you want to keep the entire system consistent, ensure both JSON and generic payloads share the same environment-driven limit or constants in the future.
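The reviewer's suggestion of one shared, environment-driven limit could look like the following pure-Rust sketch; the MAX_PAYLOAD_BYTES environment variable name and the helper function are hypothetical, not part of this PR.

```rust
use std::env;

// Default matches the 2 MB limit introduced in this PR.
const DEFAULT_MAX_PAYLOAD_BYTES: usize = 2 * 1024 * 1024;

/// Hypothetical helper: both JsonConfig and PayloadConfig would be built
/// from this single value, so the two limits cannot drift apart.
fn max_payload_bytes() -> usize {
    env::var("MAX_PAYLOAD_BYTES") // hypothetical variable name
        .ok()
        .and_then(|v| v.parse().ok())
        .unwrap_or(DEFAULT_MAX_PAYLOAD_BYTES)
}

fn main() {
    println!("payload limit: {} bytes", max_payload_bytes());
}
```

With this in place, both `JsonConfig::default().limit(max_payload_bytes())` and `PayloadConfig::new(max_payload_bytes())` would read from the same source of truth.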

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6ba6ea7 and d09888e.

📒 Files selected for processing (1)
  • services/headless-lms/server/src/config.rs (5 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (3)
  • GitHub Check: build-and-deploy
  • GitHub Check: build
  • GitHub Check: headless_lms
🔇 Additional comments (6)
services/headless-lms/server/src/config.rs (6)

10-10: Import changes look good.
Adding PayloadConfig to the list of imports is consistent with the goal of configuring payload size limits for incoming requests.


55-55: Confirm that the 2 MB JSON limit is sufficient and appropriate.
Raising the JSON payload limit to 2 MB can handle larger requests, which aligns with your PR objective. Ensure that your application’s performance and security needs (e.g., malicious large requests) are adequately addressed.


108-108: Addition of the payload_config to ServerConfig is properly integrated.
This ensures you can pass the payload size limit around as a first-class config.


116-116: Field addition in ServerConfig is straightforward.
Exposing pub payload_config: Data<PayloadConfig> cleanly allows other parts of the code to reference this limit if needed.


140-140: Destructuring payload_config in server config is valid.
This matches the pattern for other fields and sets you up to use the payload config in the Actix app.


146-146: Registering .app_data(payload_config) is aligned with Actix best practices.
This properly enforces the limit in all routes under the configured scope. Make sure to test large payload scenarios to confirm the 2 MB threshold is enforced as expected.

@nygrenh nygrenh enabled auto-merge (squash) January 24, 2025 15:17
@nygrenh nygrenh merged commit e0dfc6b into master Jan 24, 2025
18 checks passed
@nygrenh nygrenh deleted the increase-max-payload-sizes branch January 24, 2025 15:32