Code is never done, and that’s okay because the other option is over-engineering, which just makes life harder down the road.
Thanks for the mention, Petar!
The way I like to think about this is that if software had to be perfect before releasing it, the cost of verifying that everything is perfect would be too high.
We can reduce that cost by putting measures in place to prevent things from going too badly. This allows for safe iteration without a single mistake killing the company.
Thanks for sharing my article! 🙏
Having good measurements and KPIs is crucial.
A lot of this resonates with me. One thing I figured out only when I joined a relatively big team is how wildly “what is enough” varies between engineers. For example, when designing a React component, I was tempted to cover only the current requirements, but my PR ended up containing more features for the component than were asked for.
But I understood why. The component was part of a component library consumed by other libraries. So, to avoid bumping versions across many projects as I introduced even more features, we initially released a slightly more versatile component. I’m still not 100% convinced this is the way to go, since now we have unused code in the repo.
What do you think about this approach?
Thanks for the mention! 🙇
Everything is about trade-offs. I've worked with a React codebase that had an obvious anti-pattern, but it was okay because we knew what we were doing.
Regarding versioning, to be honest, I don't see a problem with changing the version of a package when there's a real reason. Usually, this is done automatically. But if the API is still unclear, certain trade-offs make sense.
I’m someone who writes software to solve the current problem. I also believe the design of the software should allow for changes to be made over time. The use of SOLID principles and proper design patterns helps; one of my favorite patterns is the adapter pattern.
Use generics and orchestrators to manage process flow. For every new use case, create a new class (or inherit from an existing one) and apply it using adapters and factories.
For example, a simple ETL process has extract, transform, and load steps to complete. Most of the time, the output of the extractor goes to the transformer, and its output goes to the loader.
Using an IOrchestrate<T, R> interface and adapters for IExtract<T>, ITransform<T, R>, and ILoad<R>, you can build an orchestrator that meets the current requirements but allows for expansion to new ones, or, if required, versioning of the current APIs to handle updated requirements. If the process flow changes, new orchestrators can be created using the same interfaces.
Proper use of dependency injection should select the correct adapter for each use case.
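A minimal sketch of this idea, written here in TypeScript (the interface names come from the comment above; the concrete CsvExtractor/RowCounter/InMemoryLoader adapters are illustrative, not part of the original):

```typescript
// Generic step interfaces, following the naming above.
interface IExtract<T> {
  extract(): T;
}
interface ITransform<T, R> {
  transform(input: T): R;
}
interface ILoad<R> {
  load(output: R): void;
}
interface IOrchestrate<T, R> {
  run(): void;
}

// The orchestrator depends only on the interfaces, so new use cases
// plug in via different adapters without changing this class.
class EtlOrchestrator<T, R> implements IOrchestrate<T, R> {
  constructor(
    private extractor: IExtract<T>,
    private transformer: ITransform<T, R>,
    private loader: ILoad<R>,
  ) {}

  run(): void {
    const raw = this.extractor.extract();     // extract step
    const shaped = this.transformer.transform(raw); // transform step
    this.loader.load(shaped);                 // load step
  }
}

// Illustrative adapters: CSV-like lines in, a row count out.
class CsvExtractor implements IExtract<string[]> {
  extract(): string[] {
    return ["a,1", "b,2"];
  }
}
class RowCounter implements ITransform<string[], number> {
  transform(rows: string[]): number {
    return rows.length;
  }
}
class InMemoryLoader implements ILoad<number> {
  loaded = 0;
  load(count: number): void {
    this.loaded = count;
  }
}

// "Injection" here is plain constructor injection; in a real app a
// DI container or factory would pick the adapters per use case.
const loader = new InMemoryLoader();
new EtlOrchestrator(new CsvExtractor(), new RowCounter(), loader).run();
console.log(loader.loaded); // 2
```

If the flow ever changes (say, a validation step between transform and load), the same three interfaces can back a new orchestrator class, leaving the existing adapters untouched.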