News
[Nov 20, 2020]: Currently, the ICPE 2021 conference, and therefore the LTB Workshop, is planned as an on-site event with remote participation facilities. The aim is to hold a live event along with the required infrastructure for online participation.
Call for Papers
Software systems (e.g., smartphone apps, desktop applications, telecommunication infrastructures, and cloud and
enterprise systems) have strict requirements on software performance. Failure to meet these requirements
may cause business losses, customer defection, brand damage, and other serious consequences. In addition to
conventional functional testing, the performance of these systems must be verified through load testing or
benchmarking to ensure quality of service.
Load testing and benchmarking software systems are difficult tasks that require a deep understanding of the
system under test and of customer behavior. Practitioners face many challenges, such as tooling (choosing and
implementing the testing tools), environments (software and hardware setup), and time (limited time to design,
test, and analyze). This one-day workshop brings together software testing researchers, practitioners, and tool
developers to discuss the challenges and opportunities of conducting research on load testing and benchmarking
software systems.
We solicit submissions in two tracks:
- Research or industry papers (maximum 4 pages)
- Presentation track for industry or research talks (extended abstract, maximum 700 words)
Research and industry papers should follow the standard ACM SIG proceedings format and must be submitted
electronically via EasyChair. Accepted papers will be published in the ICPE 2021 Proceedings.
Submissions can be research papers, position papers, case
studies or experience reports addressing issues including but not limited to the following:
- Efficient and cost-effective test executions
- Rapid and scalable analysis of the measurement results
- Case studies and experience reports on load testing and benchmarking
- Leveraging cloud computing to conduct large-scale testing
- Load testing and benchmarking on emerging systems (e.g., machine learning systems, adaptive/autonomic
systems, big data batch and stream processing systems, and cloud services)
- Continuous performance testing
- Load testing and benchmarking in the context of agile software development processes
- Using performance models to support load testing and benchmarking
- Building and maintaining load testing and benchmarking as a service
- Efficient test data management for load testing and benchmarking
- Context-driven performance testing
- Performance/load testing as an integral part of the performance engineering process
Important Dates
Paper Track (research and industry papers):
- Abstract submission: January 3, 2021
- Paper submission: January 10, 2021
- Author notification: February 1, 2021
- Camera-ready version: February 22, 2021
Presentation Track:
- Extended abstract submission: January 31, 2021
- Author notification: February 15, 2021
Organization:
Chairs:
- Alexander Podelko (MongoDB, USA)
- Tse-Hsun (Peter) Chen (Concordia University, Canada)
- Hamzeh Khazaei (York University, Canada)
Program Committee:
- Eduardo Cunha de Almeida (UFPR, Brazil)
- Zhen Ming (Jack) Jiang (York University, Canada)
- Johannes Kroß (Fortiss, Germany)
- Gerson Sunyé (University of Nantes, France)
- Heng Li (Polytechnique Montréal, Canada)
- Jinfu Chen (Huawei, Canada)
- Weiyi Shang (Concordia University, Canada)
Steering Committee:
- Ahmed E. Hassan (Queen’s University, Canada)
- Marin Litoiu (York University, Canada)
- Zhen Ming (Jack) Jiang (York University, Canada)
Past LTB Workshops: