
Evaluator Submissions List Page Updates #393

Open
24 of 50 tasks
r-bartlett-gsa opened this issue Feb 6, 2025 · 0 comments

User Story

As a challenge manager, I would like the evaluation panel workflows to be consistent in design and style with other eval site features, so that I can accomplish all tasks associated with managing my challenge in a consistent way.

Acceptance Criteria:

  • When the challenge manager clicks an evaluator's name or the number-of-submissions link on the evaluation panel page, they are shown a list of the challenge submissions assigned to that evaluator
  • Page H1 is Challenge Title (if it's a phased challenge, the phase is indicated in the H1)
  • Page H2 is "Submissions Assigned to [Evaluator Name] - [Evaluator Email]"
  • Submissions Stats Dashboard is displayed (see Submissions Stats Dashboard Updates #394)
  • The submissions table consists of the following headers:
    • Submission ID - links to the submission detail page
    • Evaluation Status
    • Score - if the evaluation is completed, a link to view the evaluation is available
    • Action - Unassign button
  • Unassigned submissions are displayed at the bottom of the table with a greyed-out background
    • Recused and unassigned submissions cannot be reassigned
  • If an evaluator has recused themselves from any of the submissions, a recused-evaluation alert is displayed above the table until the evaluator is unassigned by the challenge manager
  • An info alert is displayed explaining how to assign evaluators to submissions, with a link to the submissions list page for that challenge
  • Designs (colors, fonts, labels, etc.) match the wireframes in Figma: https://www.figma.com/design/sLkJM1Ua9Zu3HW2h4mVdgr/Challenge.gov-Eval-(Updated-01%2F27%2F25)?node-id=7003-153455&t=G2WnCXJa1OXhZSU3-1

Definition of Done

Doing (dev team)

  • Code complete
  • Code is organized appropriately
  • Any known trade-offs are documented in the associated GH issue
  • Code is documented (modules, shared functions, etc.)
  • Automated testing has been added or updated in response to changes in this PR
  • The feature is smoke tested to confirm it meets requirements
  • Database changes have been peer reviewed for index changes and performance bottlenecks
  • PR that changes or adds UI:
    • Include a screenshot of the WAVE report for the altered pages
    • Confirm changes were validated for mobile responsiveness
  • PR approved / Peer reviewed
  • Security scans passed
  • Automated accessibility tests passed
  • Build process and deployment is automated and repeatable
  • Feature toggles added, if appropriate
  • Deploy to staging
  • Move card to testing column in the board

Staging

  • Accessibility tested (Marni)
    • Keyboard navigation
    • Focus confirmed
    • Color contrast compliance
    • Screen reader testing
  • Usability testing: mobile and desktop (Tracy or Marni)
  • Cross-browser testing: UI rendering is performant on the devices/browsers listed below (Tracy or Marni)
    • Windows/Chrome
    • Windows/Edge
    • Mac/Chrome
    • Mac/Safari
    • iOS/Safari
  • AC review (Renata)
  • Deploy to production (production-like environment for eval capability) (dev team)
  • Move to production column in the board

Production

  • User and security documentation has been reviewed for necessary updates (Renata/Tracy/Dev team)
  • PO / PM approved (Renata)
  • AC is met and it works as expected (Renata)
  • Move to done column in the board (Renata)