Why Complex Systems Require Modern Engineering Tools
We live in an increasingly complex world. Data, software, AI, machine learning, internet connectivity — these all make the products we create and consume more intricate and complicated than ever.
This ever-growing complexity requires modern engineers to be both flexible and adaptable in their system design. For the typical discipline-specific engineer, this can be a challenge.
However, complexity is only going to increase. Engineers who adapt their approach will retain a competitive edge; those who don’t will fall behind. For hardware manufacturers specifically, adapting means embracing cloud native platforms, digital modeling, virtual simulations, and resource-light approaches.
By doing this, engineers can create technology that is more usable, sustainable, viable, and human.
But first, let’s take a closer look at why systems have become so complex in the first place. Then we can discuss what engineers should do in response.
Why are systems increasing in complexity?
Advances in software, data, and real-time connectivity have made once simple systems exponentially more complex. Understanding these developments is key to creating an adaptive engineering approach.
Software-defined everything (SDx)
Uncertainty has defined the last two years, particularly when it comes to supply chains. In the face of hardware limitations, manufacturers have turned to software as a means of providing additional value.
Key trends like software-defined networking (SDN), software-defined storage (SDS), and software-defined data centers (SDDC) are part of a broader movement known as software-defined everything (SDx). Rather than redeploying new hardware every time they want to innovate, manufacturers can simply push software updates across their entire fleets.
In essence, SDx takes the idea of continuous improvement and extends it throughout the entire lifecycle of a particular product.
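To make this concrete, here’s a minimal sketch in Python of a device polling for a software-defined feature update. The endpoint, payload format, and version scheme are all invented for illustration; a real fleet would use an authenticated OTA service rather than a plain HTTP poll:

```python
import json
import urllib.request

# Hypothetical update endpoint; invented for this sketch.
UPDATE_URL = "https://updates.example.com/fleet/config"

CURRENT_VERSION = 3  # version of the software currently on the device

def check_for_update():
    """Poll the cloud for a newer software-defined configuration."""
    with urllib.request.urlopen(UPDATE_URL) as resp:
        config = json.load(resp)
    if config["version"] > CURRENT_VERSION:
        apply_config(config)

def apply_config(config):
    """Toggle features in software; no hardware change required."""
    for feature, enabled in config["features"].items():
        print(f"{feature}: {'enabled' if enabled else 'disabled'}")
```

The point of the sketch is the shape of the loop: the hardware ships once, and new value arrives as configuration and code for the life of the product.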
The global SDx market is forecasted to reach $160.8 billion by 2024 from $51.7 billion in 2019, growing at a compound annual growth rate of 25%. One example of this is Volkswagen, which expects up to 33% of its revenue to come from software by 2030.
This growth shouldn’t be surprising, given the major benefits of an SDx approach:
- Optionality. Manufacturers provide slightly more expensive, memory-rich chips at the outset. This, in turn, enables additional software updates and options throughout the life of the product, increasing its value to the customer
- Flexibility. Whether companies are facing supply chain shortages, corporate mergers, acquisitions, or spin-offs, SDx enables greater flexibility to adapt to changing environments
- Scalability. SDx amortizes the cost over multiple programs and products, resulting in a cost-effective approach that eases the complexities of IT infrastructure, networks, and storage
Physical items and functions within our daily lives are becoming more software-defined, from lightbulbs to smartphones to vehicles and everything in between. As such, the value proposition of these products is changing.
SDx abstracts vital hardware and infrastructure to help organizations better manage operations, maintenance, repairs, and upgrades. As such, companies are better able to achieve economies of scale and minimize costs.
Additionally, the boundaries between enterprises and service providers are blurring, and the traditional roles of customer, technology provider, and service provider are all up in the air. This makes the once simple task of market segmentation far more complex. A single company may now play multiple roles, and may even compete against itself using the exact same hardware.
Exponential growth in Big Data and AI
The explosion of available data over the last decade should come as no surprise to anyone.
The amount of data generated by IoT devices is expected to reach 73.1 ZB (zettabytes) by 2025.
This growth is primarily driven by the fact that the core technology underlying most devices has changed. Just two decades ago, vehicles never sent or received data. This meant that engineers could not base their next models on information received from previous generations. The only way to determine the quality of a model was by testing a prototype.
Now, everything has changed. Auto engineers, for example, now have access to fleet-level data, which means there’s a continuous influx of information based on real-life experiences their vehicles are facing.
Using this data as a foundation, engineers can build virtual models that they continually test and improve as new data comes in from the fleet. One example is Tesla, which leveraged a decade of fleet data and released a software update to the motor controller. The result: 30% more range on the vehicle.
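As a rough sketch of what that continuous improvement loop can look like in code, the snippet below refines a regression model batch by batch using scikit-learn’s incremental partial_fit API. The features, target, and synthetic data generator are stand-ins for real fleet telemetry:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

# Synthetic stand-in for fleet telemetry: each daily batch maps
# operating conditions (speed in km/h, ambient temp in degC) to energy use.
rng = np.random.default_rng(0)
# Small constant step size keeps SGD stable on these unscaled features.
model = SGDRegressor(learning_rate="constant", eta0=1e-4)

for day in range(30):  # one batch of fleet data arrives per day
    X = rng.uniform([20.0, -10.0], [120.0, 40.0], size=(500, 2))
    y = 0.15 * X[:, 0] - 0.02 * X[:, 1] + rng.normal(0.0, 0.5, 500)
    model.partial_fit(X, y)  # refine the model without retraining from scratch

print(model.coef_)  # estimates approach the true [0.15, -0.02] as data accrues
```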
Additionally, machine learning and AI enable virtual simulations, which are both faster and lighter on resources than traditional prototype development. This means that engineers can test millions of scenarios before they even build a prototype. All of this is only possible because of the data at their disposal that informs these simulations.
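As a rough illustration of the scale involved, the sketch below sweeps a million randomized scenarios against a toy stopping-distance model in a fraction of a second. The physics, parameter ranges, and 100 m requirement are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000  # scenarios that would be impossible to prototype physically

# Invented toy model: stopping distance as a function of speed and road friction.
speed = rng.uniform(5.0, 40.0, N)       # m/s
friction = rng.uniform(0.2, 0.9, N)     # dry asphalt down to packed snow
distance = speed**2 / (2 * 9.81 * friction)

# Flag the edge cases worth a closer look in higher-fidelity simulation.
failures = distance > 100.0             # arbitrary 100 m requirement
print(f"{failures.mean():.2%} of scenarios exceed the requirement")
```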
More than 3.2 billion people are now online, along with an estimated 20 billion connected sensors and devices. This influx of data is a major boon to engineers, but without machine learning and AI algorithms to build and train models on it, it’s impossible to turn the data into actionable insights.
As AI and machine learning continue to develop and techniques become more sophisticated, engineers will only have more optionality when it comes to testing and developing products before building a single physical component.
Always on and always connected
A defining feature of our world today is that we’ve never been more connected. Our personal devices, cars, houses, and virtually every aspect of the world around us are connected. The IoT has successfully bridged the gap between customers, businesses, and their physically connected devices.
Think of your cell phone, for example. That simple device, now so embedded in our daily lives, is always connected to its manufacturer. That manufacturer, in turn, is constantly pushing new updates to the device. While these are mostly security updates, every so often a new feature comes through.
Always on/always connected means that manufacturers have an unprecedented ability to add features and functionality to devices that are already in-market. Each model doesn’t have to ship with full functionality; you can always add it later. Speed to market has become the norm.
What’s more, putting chips and digital connections into your products means creating value that you don’t even know exists yet. As such, the product’s potential will always run a step ahead of the engineer’s ability to fully realize it.
Distribution of engineering teams
In a post-COVID world, the fully on-site engineering team is going by the wayside: 86% of engineers are fully remote. While there are significant benefits to a remote workforce, namely access to a wider, global talent pool, there are challenges as well.
Specifically, fully remote engineering teams face challenges when it comes to collaboration and transparency in work. Without the proper digital infrastructure in place, there can be misalignment, redundancy, and points of failure. All of these can be costly, even dangerous.
Cloud native collaboration tools provide a solution to this challenge. By placing all your engineering work into the cloud, you can provide space for teams to work on the same project asynchronously. This enables transparency and multiple sets of eyes to review work, which can ensure that you don’t make simple mistakes early in the process.
How should engineers adapt to this changing complexity?
Now that we’ve laid out the problem of increasing complexity, what should you do about it? First, let’s talk about what you shouldn’t do: you shouldn’t bemoan the changes and hope things go back to “the way they were.” That’s never going to help.
Nor should you try to force-fit old ways of working into new scenarios. For one thing, the younger the engineering talent pool becomes, the less relevant those old methods are going to be. Additionally, failure to adapt to modern engineering best practices means missing out on some tangible benefits:
- Faster time to market
- Reduced costs
- Improved product quality
- Continuous improvement — including on your in-market products
- Higher customer satisfaction
As such, it’s important that you invest in processes and technologies that allow you not only to survive these changes, but thrive in the midst of them.
Digital modeling and model-based development (MBD)
Digital modeling and model-based development are now widely adopted among companies developing modern systems. Digital modeling is highly beneficial because all data related to a system is co-located with its model and used for the design, development, and management of the respective product. Because all the data is in one central location, engineers can run simulations more quickly and efficiently, build products faster, and reduce time to market.
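The sketch below shows co-location at its simplest: the parameter data lives right next to the dynamics it describes, so the same source of truth drives every simulation run. The thermal model and its parameter values are illustrative, not taken from any real product:

```python
from scipy.integrate import solve_ivp

# Co-located model and data: the parameters live next to the dynamics
# they describe. Values are illustrative, not from any real product.
params = {"mass_kg": 2.0, "heat_capacity": 900.0, "h_coeff": 15.0, "area_m2": 0.05}

def thermal_model(t, T, p, T_ambient=25.0):
    """First-order convective cooling of a component toward ambient."""
    rate = -p["h_coeff"] * p["area_m2"] * (T[0] - T_ambient)
    return [rate / (p["mass_kg"] * p["heat_capacity"])]

sol = solve_ivp(thermal_model, (0.0, 3600.0), [90.0], args=(params,))
print(f"Temperature after one hour: {sol.y[0, -1]:.1f} degC")
```

Change a parameter in one place and every downstream simulation, report, and design review sees the same updated value.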
As hardware engineers gain access to more high quality data, digital modeling has become increasingly sophisticated and has led to a number of benefits:
- Simplified system design. Digital modeling helps streamline system development with user-friendly integrated development environments that validate components against their required characteristics. This results in highly accurate design and process reliability
- Improved visualization. Digital modeling allows owner-operators and Engineering, Procurement and Construction (EPC) companies to better visualize the final plant and gives them the ability to make intelligent, data-driven decisions and forecasts
- Enhanced designer output. By removing the manual and tedious parts of designs, digital modeling helps designers visualize all components during the initial design stage, reducing the number of iterations and improving productivity
- Increased deliverable quality. To aid in meeting high-pressure global demands, digital modeling empowers companies to improve deliverable quality under tight deadlines and budgets
- Asset management. Digital modeling enhances your data management, enabling better oversight of assets, standard procedures, and automated processes, along with increased output speed and accuracy
Virtual simulations
When designing a deterministic system, it’s easy to employ “if-this-then-that” logic. Autonomy, however, cannot rely on IFTTT, simply because there are potentially billions of edge cases to account for.
This requires AI and machine learning algorithms that train themselves over billions of runs. Physical testing can never reach that scale. Virtual simulations, on the other hand, can complete each run in just a few seconds.
By running virtual simulations, AI can test and learn iteratively, enabling you not only to reduce time to market, but also to make continuous improvements over the whole product lifecycle.
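Here is a minimal sketch of that test-and-learn loop: a candidate design change is accepted only when a (deliberately toy) closed-loop simulation says it reduces cost. The plant dynamics, gain bounds, and random-search strategy are all assumptions made for illustration:

```python
import numpy as np

def simulate(gain, steps=200, dt=0.05):
    """Toy closed-loop plant under proportional control; returns tracking cost."""
    x, cost = 0.0, 0.0
    for _ in range(steps):
        error = 1.0 - x                     # track a setpoint of 1.0
        x += dt * (gain * error - 0.5 * x)  # invented first-order dynamics
        cost += error**2
    return cost

# Iterate: propose a change, keep it only if the simulation says it helps.
rng = np.random.default_rng(1)
gain, best = 0.1, simulate(0.1)
for _ in range(500):
    candidate = max(0.0, gain + rng.normal(0.0, 0.2))
    cost = simulate(candidate)
    if cost < best:
        gain, best = candidate, cost

print(f"Tuned gain after 500 virtual runs: {gain:.2f}")
```

Five hundred design iterations complete in well under a second; the same loop run on physical prototypes would take months.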
We should reiterate the importance of co-locating your data with your model. Without that critical component, virtual simulations simply take too long to run. Not only that, but you want your system to ingest as much data as possible and train the model quickly, so it delivers positive results faster.
Cloud native collaboration
Increasing complexity in both systems and teams necessitates new tools, technologies, and platforms that enable engineers to work efficiently and productively. Without cloud native collaboration, work becomes redundant, and multiple points of failure creep into the verification and validation process.
While the minimum viable product (MVP) is a good concept for app development, compliance, safety, and regulatory practices make it a non-starter for many hardware products such as cars and passenger planes. Cloud native collaborative platforms not only ensure full transparency into all data related to the project, but also allow engineers to iterate and create feedback loops.
Additionally, cloud native collaboration gives teams across the entire product lifecycle full visibility from day one, letting them solve issues in real time, reduce backlog, and bring products to market faster.
Resource-light approaches
Prototyping is a resource-heavy process that is limited in the value it can offer. Rather than your first line of defense, it should be your last, reserved for the issues you can only work out in the real world.
The sophistication of digital modeling and virtual simulations means that traditional prototyping methods are not only unnecessary but a bad investment. Instead, use these innovations as an opportunity to find other areas where you can reduce the resources you invest in product development.
By reducing time-to-market and the costs involved in production — without sacrificing product quality — you’ll make everyone from consumers to investors much happier.
Data edge processing
As we mentioned earlier, the amount of data being created is growing exponentially. A growing concern among experts is that storage capacity isn’t keeping pace with data growth.
This is where data edge processing is going to play a major role. Essentially, it’s the concept that data must be collected, processed, and managed in real time, close to where the data is generated. While generally used in reference to IoT, edge processing applies to other data sources like log files, network flow data, and more.
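As a simple sketch of the idea, the snippet below reduces a one-second window of raw sensor data to a few summary statistics on the device, so only a handful of numbers, rather than a thousand samples, cross the network. The sensor, sampling rate, and anomaly rule are invented for the example:

```python
import numpy as np

def summarize_at_edge(window: np.ndarray) -> dict:
    """Reduce a raw sensor window to the few statistics worth transmitting."""
    deviation = np.abs(window - window.mean())
    return {
        "mean": float(window.mean()),
        "peak": float(window.max()),
        "anomalies": int((deviation > 3 * window.std()).sum()),
    }

# One second of 1 kHz vibration data stays on the device;
# only three numbers cross the network.
window = np.random.default_rng(7).normal(0.0, 1.0, 1000)
print(summarize_at_edge(window))
```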
As such, it is increasingly critical that as the data surrounding your products balloons, you have the systems in place to process that data, convert it into usable information, and apply it to current and future projects in real time. Otherwise, you simply won’t be able to keep up.
What solutions are available to engineers to respond to this complexity?
If you’re concerned about the tools and platforms currently at your disposal, the good news is that there are solutions available to you. Specifically, Collimator comes ready-made with the features and functionality to respond to this increasing complexity.
Collimator is also designed for the engineer. Our visualization-based design platform for both hardware and software means that you don’t have to write a single line of code to go from concept to a fully functional model. From there you can run simulations, test, and iterate, and build your prototype with the confidence that your product is market-ready.
Learn more about Collimator and the benefits we offer engineers, and try it today for free.
Originally published at https://www.collimator.ai.