VideoLAN / dav1d · Issues · #209
Closed
Created Nov 30, 2018 by Ewout ter Hoeven (@EwoutH, Contributor)

Setup automated performance testing (and integrate GitLab CI)

Since improving performance is the whole reason dav1d is being built, automated performance testing and monitoring should be useful. I wish I were able to implement this myself, but here is the general direction in which I'm thinking.

  1. Determine a number of tests we want to run automatically. Probably a general performance test on different platforms (arm64, x86) with different SIMD extensions (depends somewhat on #198 (closed)), a scaling test (performance over 1 to n threads) and a profiling test (to determine which functions improved/degraded). Please let us know what (else) would be useful.
  2. Encode or get files to perform these tests.
  3. Get a very consistent hardware platform. This could be dedicated or existing systems (tests would be run overnight), but it could also be cloud-based. Since Google, Amazon and Microsoft are all AOM members, they might be willing to help. Fixing clock speeds is extremely important, since we have had enough Turbo woes (#101 (closed)).
  4. Get a stable test software configuration: a stable Linux, ivf files on a (persistent) SSD, and the needed test packages installed.
  5. Write some test instructions/code/commands to run the tests and output/save useful data.
  6. Trigger this platform daily to run these performance tests. Also enable an option to trigger it manually for merge requests that are suspected to impact performance.
  7. Upload the results to some database. Send push notifications if performance differs by more than some margin of error (2% or so).
  8. Build some fancy graphs and statistics from this database.
  9. If possible, integrate with GitLab CI.
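To make step 7 concrete, the regression check could be as simple as comparing new timings against a stored baseline and flagging anything outside the margin of error. A minimal sketch (the function name, test names and timings are illustrative, not an existing dav1d tool; the 2% margin is the figure suggested above):

```python
# Sketch of a regression check: compare new benchmark timings against a
# stored baseline and flag tests whose runtime moved by more than the margin.
# All names and numbers here are placeholders for illustration.

MARGIN = 0.02  # 2% margin of error, as suggested above

def check_regressions(baseline, current, margin=MARGIN):
    """Return (test, relative_change) pairs for tests whose runtime
    changed by more than `margin` relative to the baseline."""
    flagged = []
    for test, base_time in baseline.items():
        new_time = current.get(test)
        if new_time is None:
            continue  # test was not run this time
        change = (new_time - base_time) / base_time
        if abs(change) > margin:
            flagged.append((test, change))
    return flagged

# Hypothetical timings in seconds for two decode benchmarks.
baseline = {"decode_1080p_4threads": 10.0, "decode_4k_8threads": 40.0}
current = {"decode_1080p_4threads": 10.1, "decode_4k_8threads": 38.0}

for test, change in check_regressions(baseline, current):
    print(f"{test}: {change:+.1%}")
# → decode_4k_8threads: -5.0%  (the 1% shift on the 1080p test stays
#   within the margin and is not reported)
```

The same comparison could be wired into a scheduled GitLab CI job (step 9) that fails, or sends a notification, when the returned list is non-empty.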