Jira’s Advanced Planning tool (the artist formerly known as Advanced Roadmaps) allows users to manage work across multiple projects, products, and teams. However, if you try to use Advanced Planning at scale, it is impossible to set up plans that give each organizational unit what it needs: a plan that shows its own work and how that work relates to other teams' work.
Protect Your Business from IT Disasters: Lessons from the CrowdStrike Outage
Businesses rely heavily on IT infrastructure for smooth operations and security, but what happens when that infrastructure breaks? Thousands of companies were blindsided by the sudden CrowdStrike outage on Friday, July 19th, which carries a projected cost of $1 billion. With an estimated 8.5 million devices impacted, over 5,000 canceled flights, and hundreds of hospital computer systems knocked offline, you’re probably wondering: What can I do to prevent a CrowdStrike-level business disruption?

Amy Cutlip
Your Next AI Startup Should Be Built on Temporal [Part 3: Automated Prompt Testing]
Welcome to part three of our series on using Temporal to improve the reliability of applications built around LLMs like the one that powers ChatGPT. In part one, you learned how to use Temporal to clone a repo and ingest its documentation into a retrieval-augmented generation (RAG) database for use with your LLM. Part two taught you how to use context injection to give users more accurate answers to prompts made against that documentation. In this post, you’ll use Temporal and a second LLM to automatically test the accuracy of your application’s answers to the prompts from part two; a rough sketch of that kind of Workflow follows this entry.

Marshall Thompson
Developer
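To make the shape of that testing Workflow concrete, here is a minimal sketch using Temporal's Python SDK (temporalio). It is not the article's code: the activity names, the pass/fail grading scheme, and the timeouts are illustrative assumptions, and each activity body is a placeholder for the real calls to your application and to the grading LLM.

```python
# Illustrative sketch only; activity names and grading scheme are assumptions.
from datetime import timedelta

from temporalio import activity, workflow


@activity.defn
async def answer_prompt(prompt: str) -> str:
    # Placeholder: call the documentation-backed LLM app from parts one and two.
    raise NotImplementedError


@activity.defn
async def grade_answer(prompt: str, answer: str) -> bool:
    # Placeholder: ask a second LLM whether the answer correctly addresses the prompt.
    raise NotImplementedError


@workflow.defn
class PromptTestWorkflow:
    @workflow.run
    async def run(self, prompts: list[str]) -> dict[str, bool]:
        results: dict[str, bool] = {}
        for prompt in prompts:
            # Each step runs as an Activity, so Temporal handles retries and failures.
            answer = await workflow.execute_activity(
                answer_prompt,
                prompt,
                start_to_close_timeout=timedelta(minutes=2),
            )
            results[prompt] = await workflow.execute_activity(
                grade_answer,
                args=[prompt, answer],
                start_to_close_timeout=timedelta(minutes=2),
            )
        return results
```

In practice you would also register the Workflow and activities with a temporalio Worker and start the Workflow from a Client; that wiring is omitted here to keep the sketch short.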
CI/CD Pipelines & Use Cases: Everything DevOps Consulting Has Taught Me
Throughout my DevOps consulting career, I've realized that the true value of our work lies in optimizing processes to free up developer time. A key part of optimizing DevOps processes is understanding the difference between manual (on-demand) and automated (triggered) processes.
In this post, we’ll dive into both manual and automated triggers, exploring their roles and how they fit together in CI/CD pipelines; a short sketch of how a pipeline step can tell the two apart follows this entry. We’ll also unpack the difference between Continuous Delivery and Continuous Deployment, and why one might be a more realistic goal than the other. By understanding these mechanisms and exploring automation opportunities, you can streamline workflows, reduce errors, and boost productivity in your development process.

Max Cascone
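As a concrete illustration of manual versus automated triggers, here is a small Python sketch of a deploy gate, assuming the pipeline runs on GitHub Actions (which exposes the trigger type via the GITHUB_EVENT_NAME environment variable). The deploy.sh script and the branch policy are hypothetical stand-ins, not a recommendation from the post.

```python
# Illustrative only: behave differently depending on whether the run was
# started manually (on demand) or by an automated trigger.
import os
import subprocess
import sys


def main() -> None:
    event = os.environ.get("GITHUB_EVENT_NAME", "")
    if event == "workflow_dispatch":
        # Manual run: a person clicked "Run workflow", so deploy straight away.
        print("Manual trigger detected; deploying.")
    elif event == "push":
        # Automated run: only deploy when the push landed on the main branch.
        if os.environ.get("GITHUB_REF", "") != "refs/heads/main":
            print("Automated trigger on a non-main branch; skipping deploy.")
            return
        print("Automated trigger on main; deploying.")
    else:
        sys.exit(f"Unexpected trigger: {event!r}")

    subprocess.run(["./deploy.sh"], check=True)  # hypothetical deploy script


if __name__ == "__main__":
    main()
```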
The Truth Behind Micro Frontends: Insights from Real Case Studies
When you’re looking for an easier-to-manage web application, the idea of breaking a traditional monolithic frontend into smaller, more manageable segments conjures visions of enhanced efficiency and seamless deployment.

Amy Cutlip
CI/CD Tools & Processes: Everything DevOps Consulting Has Taught Me
Continuous Integration (CI) and Continuous Delivery & Deployment (CD) are pivotal concepts in modern software development, facilitating rapid and reliable delivery. By frequently integrating code into a shared repository, detecting issues early, and automating deployments, CI/CD ensures that software can consistently be released to production with high confidence.

Max Cascone
Your Next AI Startup Should Be Built on Temporal [Part 2: Prompt Engineering]
Welcome to part two of our series about using Temporal to improve the reliability of applications built around Large Language Models (LLMs) like the one that powers ChatGPT. Part one explained how to build a Temporal Workflow that processes a series of documents and makes them accessible to your LLM. This post shows how to develop a Temporal Workflow that finds documents relevant to a user’s query and supplies them as context to a prompt sent to the LLM, a technique known as context injection; a rough sketch follows this entry. You’ll also learn how Temporal's abstraction makes your application more reliable and easier to extend with new features.
Kevin Phillips
Director of Backend Development
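For a rough idea of what that context-injection Workflow could look like, here is a sketch with Temporal's Python SDK (temporalio). The activity names, the prompt template, and the timeouts are assumptions made for illustration; the retrieval and LLM calls are left as placeholders.

```python
# Illustrative sketch only; not the article's actual Workflow.
from datetime import timedelta

from temporalio import activity, workflow


@activity.defn
async def find_relevant_docs(query: str) -> list[str]:
    # Placeholder: query the RAG database built in part one for matching passages.
    raise NotImplementedError


@activity.defn
async def ask_llm(prompt: str) -> str:
    # Placeholder: send the assembled prompt to the LLM and return its answer.
    raise NotImplementedError


@workflow.defn
class ContextInjectionWorkflow:
    @workflow.run
    async def run(self, query: str) -> str:
        docs = await workflow.execute_activity(
            find_relevant_docs,
            query,
            start_to_close_timeout=timedelta(minutes=1),
        )
        # Inject the retrieved passages into the prompt as context.
        context = "\n".join(docs)
        prompt = (
            "Answer the question using only the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {query}"
        )
        return await workflow.execute_activity(
            ask_llm,
            prompt,
            start_to_close_timeout=timedelta(minutes=2),
        )
```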
Your Next AI Startup Should Be Built on Temporal [Part 1: Document Processing]
Taking advantage of the burgeoning AI trend, many of today's applications are built around AI tools like ChatGPT and other Large Language Models (LLMs). These AI-powered applications often have complex software pipelines for collecting and processing the data used with the LLM. Temporal provides an abstraction that can significantly simplify those pipelines, making them more reliable and easier to develop. In this post, you’ll discover why you should use Temporal to build applications around LLMs; a rough sketch of such a pipeline follows this entry.
Kevin Phillips
Director of Backend Development
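To give a feel for the document-processing pipeline described above, here is a hedged sketch using Temporal's Python SDK (temporalio). The clone/ingest split, activity names, timeouts, and retry policy are illustrative assumptions rather than the article's implementation; the point is that Temporal retries the flaky network steps for you instead of you hand-rolling that logic.

```python
# Illustrative sketch only; activity names and retry policy are assumptions.
from datetime import timedelta

from temporalio import activity, workflow
from temporalio.common import RetryPolicy


@activity.defn
async def clone_repo(repo_url: str) -> str:
    # Placeholder: clone the repository and return a local path to its docs.
    raise NotImplementedError


@activity.defn
async def ingest_docs(docs_path: str) -> int:
    # Placeholder: chunk and embed the docs, store them for the LLM, return the count.
    raise NotImplementedError


@workflow.defn
class DocumentIngestionWorkflow:
    @workflow.run
    async def run(self, repo_url: str) -> int:
        retry = RetryPolicy(maximum_attempts=5)
        docs_path = await workflow.execute_activity(
            clone_repo,
            repo_url,
            start_to_close_timeout=timedelta(minutes=10),
            retry_policy=retry,
        )
        return await workflow.execute_activity(
            ingest_docs,
            docs_path,
            start_to_close_timeout=timedelta(minutes=30),
            retry_policy=retry,
        )
```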