Origin graphs for CT-PT test performance metrics
I'm currently preparing for the ISTQB Certified Tester Performance Testing (CT-PT) exam and also working on real-world performance test reports. One area I want to improve is visualizing test performance metrics (response times, throughput, error rates, and scalability patterns) in Origin. Tools like JMeter and LoadRunner can export CSV logs with metrics over time. Normally I would script multiple plots in Python/Matplotlib (e.g., response times across test scenarios, error rate vs. concurrent users), but I'm trying to replicate that workflow in Origin with more advanced formatting and automation.

What I'm trying to achieve:
- Multi-panel layout: e.g., a 3x3 grid where each panel tracks a different CT-PT performance metric across test runs.
- Grouped Y-series per metric: e.g., response times from different test cases (Login, Checkout, Search) grouped within the same panel.
- A unified legend across all panels for cleaner reports.
- Cloneable templates, so I can reuse the same visualization format for different test datasets without rebuilding everything manually.
- Automated scripting (LabTalk or Python in Origin): ideally I'd like to automate import + plot generation, similar to my Python workflow (rough sketch below).

Questions:
- Has anyone here automated Origin workflows for software performance testing data before?
- How would you suggest handling multi-metric grouping and unified legends for CT-PT-style reports?
- Is it practical to use cloneable graph templates with labels (e.g., scenario names like Login/Checkout) as group identifiers?

I also think visualizing CT-PT test results in Origin could make studying metrics and patterns easier than reading raw logs. I've been using practice resources from certshero to map the concepts onto real data analysis, and adding Origin graphs makes the preparation more hands-on.

Would love to hear your experiences or see example workflows![:)]
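For context, here is a rough sketch of the automation I have in mind, using OriginLab's originpro Python package. The CSV path, the column layout (time in column 0, Login/Checkout/Search response times in columns 1-3), and the template name 'perf_panels' are placeholders I made up for illustration; a real multi-panel template would have to be built once in the GUI and saved first, and I haven't verified that every step below behaves the way I expect.

[code]
# Rough sketch only -- assumes Origin with the originpro package installed.
# Column layout and paths are placeholders for a JMeter/LoadRunner CSV export.
import originpro as op

# When run outside Origin, make the session visible.
if op.oext:
    op.set_show(True)

# 1. Import the CSV into a new worksheet.
wks = op.new_sheet('w', lname='CTPT_Run1')
wks.from_file(r'C:\perf\run1_metrics.csv')  # placeholder path

# 2. Create a graph; 'perf_panels' stands in for a multi-panel template
#    built once in the GUI and saved (it does not ship with Origin).
gp = op.new_graph(template='perf_panels', lname='CTPT_Report')
gl = gp[0]  # first panel/layer

# 3. Plot the three response-time columns (cols 1-3) against time (col 0)
#    in the same panel, then group them so they share styling and legend.
for col in (1, 2, 3):  # Login, Checkout, Search
    gl.add_plot(wks, coly=col, colx=0)
gl.group()
gl.rescale()

# 4. Refresh the legend via LabTalk (simplest route I know of from Python).
op.lt_exec('legendupdate')

# 5. Save the project so the graph can be reused as a clone source.
op.save(r'C:\perf\ctpt_report.opju')  # placeholder path

if op.oext:
    op.exit()
[/code]

If the LabTalk side (e.g., for the unified legend or cloneable templates) is the better route, I'd be just as happy to see an example of that instead.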