
Modeling exercises: Add preliminary AI feedback requests for students #9278

Merged

Conversation

@LeonWehrhahn (Contributor) commented Sep 4, 2024

Checklist

General

Server

  • Important: I implemented the changes with a very good performance and prevented too many (unnecessary) and too complex database calls.
  • I strictly followed the principle of data economy for all database calls.
  • I strictly followed the server coding and design guidelines.
  • I added multiple integration tests (Spring) related to the features (with a high test coverage).
  • I added pre-authorization annotations according to the guidelines and checked the course groups for all new REST Calls (security).
  • I documented the Java code using JavaDoc style.

Client

  • Important: I implemented the changes with a very good performance, prevented too many (unnecessary) REST calls and made sure the UI is responsive, even with large data (e.g. using paging).
  • I strictly followed the principle of data economy for all client-server REST calls.
  • I strictly followed the client coding and design guidelines.
  • I added multiple integration tests (Jest) related to the features (with a high test coverage), while following the test guidelines.
  • I added authorities to all new routes and checked the course groups for displaying navigation elements (links, buttons).
  • I documented the TypeScript code using JSDoc style.
  • I added multiple screenshots/screencasts of my UI changes.

Motivation and Context

Currently, students cannot receive feedback on modeling exercises, except by asking tutors. This PR aims to change that by enabling the option for students to request AI feedback on their modeling exercises.

Description

This PR does not focus on the quality of the feedback generation; other Athena PRs (e.g., #340) address that.

This PR builds upon Enea's preliminary AI feedback for text exercises and introduces AI-generated feedback for modeling exercises as well.

In Modeling Exercises, a new option now allows students to request AI-generated feedback, with a predefined limit on the number of requests per student. Once requested, the feedback is automatically saved and attached to the relevant submission. Students can request feedback once for each submission.

Previously, modeling exercises operated with a single, continuously updated submission. Now, if Athena feedback is attached to an existing submission, a new submission will be created when a student resubmits their work.

Additionally, tutor assessments have been updated to ignore Athena feedback and retrieve the latest submission, rather than defaulting to the initial one as was previously the case.
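The resubmission rule described above (create a new submission only when Athena feedback is already attached to the latest one, otherwise keep updating it in place) can be sketched roughly as follows. This is an illustrative model only; the names `Submission`, `hasAthenaResult`, and `resubmit` are hypothetical and do not reflect the actual Artemis API:

```java
import java.util.ArrayList;
import java.util.List;

public class SubmissionVersioningSketch {

    // Per-student request limit mentioned in the PR description (assumed constant name)
    static final int MAX_FEEDBACK_REQUESTS = 10;

    // Minimal stand-in for a modeling submission
    record Submission(String model, boolean hasAthenaResult) {}

    /**
     * Applies the versioning rule: if the latest submission already carries
     * Athena feedback, a resubmission creates a new submission; otherwise the
     * latest submission is updated in place (the previous single-submission behavior).
     */
    static List<Submission> resubmit(List<Submission> history, String newModel) {
        List<Submission> updated = new ArrayList<>(history);
        if (updated.isEmpty() || updated.get(updated.size() - 1).hasAthenaResult()) {
            updated.add(new Submission(newModel, false));
        } else {
            updated.set(updated.size() - 1, new Submission(newModel, false));
        }
        return updated;
    }

    public static void main(String[] args) {
        List<Submission> history = List.of();
        history = resubmit(history, "v1");  // first submission -> created
        history = resubmit(history, "v2");  // no feedback attached yet -> updated in place
        System.out.println(history.size());

        // Simulate Athena feedback being attached to the latest submission
        history = List.of(new Submission("v2", true));
        history = resubmit(history, "v3");  // feedback attached -> new submission created
        System.out.println(history.size());
    }
}
```

Under this rule, tutor assessment then operates on the latest submission in the history rather than the first one.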

Steps for Testing

Only Deploy to TS1

1. Instructor Setup:

  • Create a new modeling exercise with the AI feedback option enabled.
  • Optionally, include grading instructions to guide the assessment.

2. Student Participation:

  • Log in as a student, submit a solution for the modeling exercise, and request AI feedback by clicking the "Request AI feedback" button (up to 10 times).

Check:

  • Each AI feedback request should create a new submission and return feedback.
  • The student should be able to continue modifying the submission even after receiving feedback.
  • AI feedback results must be visible before the due date and assessment due date.
  • Results should appear as GRADED and tagged as PRELIMINARY in the detailed view.

3. Instructor Assessment:

  • Log in as the instructor and begin the assessment process.

Check:

  • Confirm that the latest student submission is retrieved (verify in the participations or scores tab).
  • Ensure that the assessment process works as expected, with the student seeing the final result.
  • Verify that complaint and complaint assessment functionalities continue to work as usual.

Testserver States

Note

These badges show the state of the test servers.
Green = Currently available, Red = Currently locked
Click on the badges to get to the test servers.







Review Progress

Performance Review

  • I (as a reviewer) confirm that the client changes (in particular related to REST calls and UI responsiveness) are implemented with a very good performance even for very large courses with more than 2000 students.
  • I (as a reviewer) confirm that the server changes (in particular related to database calls) are implemented with a very good performance even for very large courses with more than 2000 students.

Code Review

  • Code Review 1
  • Code Review 2

Manual Tests

  • Test 1
  • Test 2

Exam Mode Test

  • Test 1
  • Test 2

Performance Tests

  • Test 1
  • Test 2

Test Coverage

Screenshots

Feedback View

image

Submission History View

image

Screencast of Feedback Generation Flow

Bildschirmaufnahme.2024-09-12.um.23.28.36.mov

Summary by CodeRabbit

Release Notes

  • New Features

    • Expanded feedback request support for modeling exercises, allowing students to receive feedback for both text and modeling exercises.
    • Introduced a new service for handling feedback requests specifically for modeling exercises, utilizing AI-generated suggestions.
    • Enhanced result display logic to filter results for both TEXT and MODELING exercises based on success criteria.
    • Enabled preliminary feedback for modeling exercises before the assessment due date.
    • Improved dynamic feature availability checks based on exercise type in the user interface.
  • Bug Fixes

    • Improved logic to prevent runtime errors related to feedback assignment in the modeling assessment editor.
  • Tests

    • Added new test cases to ensure proper functionality for feedback requests and result handling in modeling exercises.

Enea_Gore and others added 30 commits August 4, 2024 20:33
⚠️ Unable to deploy to test servers ⚠️

The docker build needs to run through before deploying.

@github-actions bot added the "deployment-error" label (added by deployment workflows if an error occurred) Sep 21, 2024
@LeonWehrhahn added the "deploy:artemis-test1" label and removed the "deployment-error" label Sep 21, 2024
@LeonWehrhahn temporarily deployed to artemis-test1.artemis.cit.tum.de September 21, 2024 07:30 with GitHub Actions
@coolchock (Contributor) left a comment: re-approve after bug fix

@EneaGore (Contributor) left a comment: re-approve after bug fix

@eceeeren (Contributor) left a comment: Re-approve after bug fix

@pzdr7 (Contributor) left a comment: Reapprove

@maximiliansoelch (Member) left a comment: Code

Labels
client (Pull requests that update TypeScript code; added automatically), maintainer-approved (The feature maintainer has approved the PR), ready to merge, server (Pull requests that update Java code; added automatically), tests
Projects
Status: Merged
10 participants