this post was submitted on 08 Jul 2023
16 points (100.0% liked)

I'm pretty familiar with automated tests where you compare a received value to an expected value (basically all unit/integration tests). In a CI/CD workflow, you handle test failures by failing the whole pipeline, and that commit/PR/etc. then has a failed pipeline showing next to it.
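
For concreteness, this is the style I mean: a minimal Rust sketch with made-up names, where the assertion either passes or fails the whole run.

```rust
// Stand-in for a real measurement; the name and threshold are invented.
fn response_time_ms() -> u64 {
    42
}

#[test]
fn response_is_fast_enough() {
    // Strictly pass/fail: the pipeline goes red or green,
    // and no history is kept from one commit to the next.
    assert!(response_time_ms() < 100);
}
```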

However, what if I have some kind of "performance" measure I want to track instead? Something that isn't pass/fail, but rather a set of experimental results over time (e.g. speed of responses from an API, win/draw/loss rates of a chess bot, confusion-matrix scores for a classifier, etc.)? Is there a tool that can show that kind of "automated experiment" result ordered by git commit, pull request, etc.?
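
To make the kind of data concrete: each run would produce a record like this, keyed by the commit, rather than a pass/fail verdict. A minimal Rust sketch, assuming a GitHub-Actions-style runner that sets `GITHUB_SHA`; the file name and metric names are invented.

```rust
use std::env;
use std::fs::OpenOptions;
use std::io::Write;
use std::time::{SystemTime, UNIX_EPOCH};

/// Append one experiment result, keyed by commit, to a CSV history file.
/// `GITHUB_SHA` is the commit env var on GitHub-Actions-compatible runners;
/// falling back to "unknown" keeps local runs working.
fn record_metric(name: &str, value: f64) -> std::io::Result<()> {
    let commit = env::var("GITHUB_SHA").unwrap_or_else(|_| "unknown".into());
    let ts = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap()
        .as_secs();
    let mut file = OpenOptions::new()
        .create(true)
        .append(true)
        .open("metrics_history.csv")?;
    writeln!(file, "{ts},{commit},{name},{value}")
}

fn main() -> std::io::Result<()> {
    // Hypothetical measurements; in practice these would come from the run.
    record_metric("api_response_ms", 123.4)?;
    record_metric("win_rate", 0.62)
}
```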

I thought about sending the data to some kind of data store with a Grafana front end, but I was hoping there might be a less DIY method of creating such a display.
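
For reference, a rough sketch of what that DIY route could look like: the CI job pushes one sample per run to a Prometheus Pushgateway, and Grafana graphs it over time via Prometheus. This assumes the `reqwest` crate with its "blocking" feature enabled; the host, job name, metric name, and commit value are placeholders.

```rust
// Push one gauge per run to a Prometheus Pushgateway using the
// text exposition format, with the commit SHA attached as a label.
// (High-cardinality labels are fine at hobby scale, less so in production.)
fn push_metric(commit: &str, name: &str, value: f64) -> Result<(), reqwest::Error> {
    let body = format!("{name}{{commit=\"{commit}\"}} {value}\n");
    reqwest::blocking::Client::new()
        .post("http://pushgateway.local:9091/metrics/job/experiments")
        .body(body)
        .send()?
        .error_for_status()?;
    Ok(())
}

fn main() -> Result<(), reqwest::Error> {
    push_metric("abc1234", "chess_bot_win_rate", 0.62)
}
```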

[–] [email protected] 2 points 1 year ago (1 children)

Hard to recommend anything without some hint about your build setup. Java via Jenkins? Node via Bitbucket Pipelines? C# via Azure DevOps?

[–] GaussianInteger 1 points 1 year ago

My particular use case is actually a hobby/fun project: developing a bot in Rust to play a game (specifically, Screeps), and I want to track how quickly it hits certain game thresholds with each newly developed feature. I'm using Gitea Actions for CI/CD, but it's all running on my local network/home lab, so I'm happy to shift as needed.
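
In case it's useful while you look for a proper tool, one low-tech way to surface the trend right in the job log is to keep the previous run's numbers around (committed, cached, or stored as an artifact) and have a small step print the deltas. A rough Rust sketch; the file names, the "threshold,ticks" CSV layout, and the idea of a cached baseline are all assumptions.

```rust
use std::collections::HashMap;
use std::fs;

// Parse a "name,ticks" CSV into a map; missing or malformed lines are skipped.
fn load(path: &str) -> HashMap<String, u64> {
    fs::read_to_string(path)
        .unwrap_or_default()
        .lines()
        .filter_map(|line| {
            let (name, ticks) = line.split_once(',')?;
            Some((name.to_string(), ticks.trim().parse().ok()?))
        })
        .collect()
}

fn main() {
    // Compare this run's threshold timings against the previous run's baseline
    // and print the deltas so they show up directly in the CI job log.
    let baseline = load("baseline_ticks.csv");
    let current = load("current_ticks.csv");
    for (threshold, ticks) in &current {
        match baseline.get(threshold) {
            Some(prev) => println!(
                "{threshold}: {ticks} ticks ({:+} vs previous run)",
                *ticks as i64 - *prev as i64
            ),
            None => println!("{threshold}: {ticks} ticks (no baseline)"),
        }
    }
}
```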