Observable Lightning Talks September 2023
Observable Lightning Talks is back with new speakers for the 4th edition. Come and join us to hear tips and tricks from industry experts.
Observable Lightning Talks opens the stage to three speakers sharing their experiences, recommendations, best practices, and more.
And if you have any questions for our speakers, you can ask them during the live panel discussion.
Come and join us on the 20th of September for the 4th edition of Observable Lightning Talks.
The event will be broadcast live on LinkedIn, Twitch, Facebook, and of course YouTube.
Observability and SRE practices rely on many frameworks available on the market. However, since technology evolves at a very fast pace, it can be difficult to keep track of the latest solutions and technologies and implement all the best practices.
To help you improve your practices and understand the best-of-breed among the latest technologies, IsitObservable is launching a new show: Observable Lightning Talks.
The concept is simple: Thought leaders from the observability industry will help us along in our technology journey.
Each Observable Lightning Talks edition will host three speakers covering three different topics that we all love:
The 4th edition of Observable Lightning Talks is planned for the 20th of September, between 12 PM and 1:30 PM EST (6 PM to 7:30 PM CEST).
Here is the agenda for the 4th edition. The three presentations are:
Distributed tracing in event-driven architecture
Tamini, Developer Advocate at Solace
What is distributed tracing in the context of event-driven architecture, and how can it be implemented?
Enterprise-wide observability: trends and blockers
Angelika Heinrich, Project Manager at Mainstorconcept
As an enterprise mainframe expert, my work brings me close to SREs, architects, product owners, and technical team leads at Fortune 100 companies and the top 50 banks in the world. Although each organization's problems with observability seem different at first glance, my team and I observed two overarching trends, as well as common blockers that hinder system-wide visibility, that may help you identify the optimal strategy for enterprise-wide observability at your organization.
The Art of Macro Benchmarking using Cloud Native Observability
Bartek Plotka, Senior Software Engineer at Google
Benchmarking is hard, especially on a macro level that stress tests multiple code components or microservices. It's challenging to reproduce the production conditions that matter, isolate the test from external effects, and reliably automate all the steps. This leads to unreliable benchmark results, long benchmark execution times, and expensive stress tests, and ultimately to wasted engineering time and business money.