
Lakshita Sharma

Senior SDET, Cvent

Biography

Hi everyone, I am Lakshita Sharma, born and brought up in India’s capital, New Delhi. I graduated in Computer Science and Engineering from GGSIPU. I have been in the world of test automation for more than three years, working on performance and quality engineering for projects involving a payments gateway and an event cloud. I have been associated with Cvent for more than two years, working on different aspects of testing and building expertise in Java, JMeter, Gatling, Sitespeed, and various monitoring tools.

On a personal front, I enjoy writing poems and reading a lot of fiction. I love exploring the unexplored.

I am also a technology geek and have always found a kick in learning new technologies, taking them apart, and building things with them. What we can build with existing technology is nothing short of magic to me. I also try to follow startups and businesses to understand their business models.

Shift-Left Performance Engineering with CI/CD
Performance is a critical non-functional quality attribute in software development. Performance issues discovered late in the development cycle can lead to significant problems down the road, where resolution is expensive and time-consuming. Continuous integration and delivery, coupled with automated performance tests, makes it possible to find performance issues while the code is still being developed. This is a faster and more cost-effective approach to releasing new product features and enhancements with minimal performance risk, drastically reducing time and effort. As we shift left, it becomes crucial to find both functional and non-functional issues early in the release cycle.

At Cvent, we have implemented an end-to-end automated process for continuous performance engineering, minimizing the monitoring time needed to check for degradation. Before going to production, code is deployed to a pre-production region, where load tests are executed and analyzed in nightly regressions to check for performance bottlenecks. The process includes data setup at runtime using Test Data Management, test execution, and report generation using Gatling. As part of release certification, we schedule load regression test runs via a Slackbot, which can be triggered in either Auto-Abort or Notify-Only mode, after which a notification is sent. Before a test kicks off, the system checks whether auto-scaling is enabled and, if so, scales up the services based on service utilization metrics. When load tests are executed from the Jenkins pipeline, results and metrics are collected immediately after the tests complete. Once a test finishes, a customized reporting framework generates a consolidated report with graphical data analysis and performance metrics from Gatling, Datadog, and Splunk.
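To make this concrete, below is a minimal sketch of a Gatling simulation with pass/fail assertions of the kind that could gate a nightly regression run. The endpoint, injection profile, and thresholds are illustrative assumptions for this sketch, not Cvent’s actual configuration.

// Minimal Gatling load simulation sketch (Scala DSL).
// Base URL, scenario steps, and SLA thresholds are hypothetical.
import scala.concurrent.duration._
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class CheckoutLoadSimulation extends Simulation {

  // Hypothetical pre-production base URL
  val httpProtocol = http
    .baseUrl("https://preprod.example.com")
    .acceptHeader("application/json")

  // Hypothetical scenario: list events, then submit a payment
  val scn = scenario("Checkout flow")
    .exec(http("List events").get("/api/events"))
    .pause(1)
    .exec(
      http("Submit payment")
        .post("/api/payments")
        .body(StringBody("""{"amount": 100}""")).asJson
        .check(status.is(200))
    )

  setUp(
    scn.inject(rampUsers(200).during(5.minutes))
  ).protocols(httpProtocol)
    .assertions(
      // Assumed SLAs: these thresholds produce the pass/fail signal
      // that a CI job or Auto-Abort style check could act on
      global.responseTime.percentile3.lt(1500),
      global.failedRequests.percent.lt(1)
    )
}

When the assertions fail, Gatling marks the run as failed, which a Jenkins pipeline stage can use to abort the release or send a notification, depending on the chosen mode.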

View Ekta Khatana and Lakshita Sharma’s video