Joost van der Vlies: What Tech Leaders Should Know About Software-Defined Logistics

Disruptions are commonplace in the supply chain and logistics industries. Thankfully, technologies such as Software-Defined Logistics (SDL) offer game-changing solutions to obstacles in the supply chain. We speak with Joost van der Vlies, CTO and Head of Architecture at PostNL, on the benefits and challenges of implementing SDL, as well as important insights on cloud technology in the logistics space.  

 

SDL is an emerging approach for edge computing, and PostNL is one of the pioneers in this area. How do you define the use of SDL for enterprises? What benefits does it bring?

SDL is all about using data and algorithms to steer the supply chain in all its aspects: forecasting, planning, execution, monitoring, communication, and making real-time decisions automatically. For example, in the past our network setup determined the physical flow of a logistical item (e.g., a parcel); now it is the digital twin of that item that determines the physical flow through our third-party networks. The digital twin contains not only the metadata of the parcel and the order but also customer and operator preferences, which can be updated in real time. For example, it can decide at the sorting belt to change the operator from home delivery to retail delivery because the consumer updated their delivery preferences, or to change the operator from bicycle delivery to truck delivery because the item was much heavier than communicated. This means SDL is about sensing, deciding, and responding, which makes logistics much more flexible and dynamic. Interestingly, this also creates a lot of new data, which can be leveraged in ways not thought of before. 
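To make the sense-decide-respond idea concrete, here is a minimal sketch, not PostNL's actual system; the parcel fields, operator names, and weight threshold are illustrative assumptions. It shows how real-time updates to a parcel's digital twin can change the routing decision:

```python
from dataclasses import dataclass

@dataclass
class ParcelTwin:
    """Illustrative digital twin of a parcel (fields are assumptions)."""
    parcel_id: str
    weight_kg: float            # measured weight, may differ from what was communicated
    delivery_preference: str    # "home" or "retail", updatable in real time
    operator: str = "bicycle"   # currently planned delivery operator

def decide_routing(twin: ParcelTwin, max_bicycle_kg: float = 5.0) -> ParcelTwin:
    """Sense the twin's latest state and respond with an updated routing decision."""
    if twin.delivery_preference == "retail":
        twin.operator = "retail-delivery"      # consumer changed preference mid-flow
    elif twin.weight_kg > max_bicycle_kg:
        twin.operator = "truck"                # heavier than communicated, so no bicycle
    else:
        twin.operator = "home-delivery-bicycle"
    return twin

# A real-time update arriving from the consumer app just before sorting:
twin = ParcelTwin("3SABCD123", weight_kg=7.2, delivery_preference="home")
print(decide_routing(twin).operator)  # -> "truck"
```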

 

For PostNL, how are you effectively utilizing SDL as part of your cloud strategy? What is the framework and how can CTOs apply it to their organization?

Our cloud strategy is a multi-cloud strategy comprising SaaS, PaaS, and IaaS service providers, and a strong connectivity layer that also includes edge environments. SDL is part of a more digitized business, and cloud is the de facto delivery model for digital business. Within our cloud strategy, the emphasis is on cloud-native, component-based application architectures, which can scale automatically with the logistical volume and can take part in the sense-and-respond patterns that SDL requires. We train our machine learning models in the cloud and deploy them where decisions are made, which can be either in the cloud or at the edge. As response time and throughput are essential factors, we use global tier 1 internet service providers that offer abundant capacity, maximum uptime, and truly global coverage (for our international business), and private network partners where necessary. 
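As a rough illustration of the train-in-the-cloud, deploy-where-decisions-are-made pattern (a minimal sketch, not PostNL's pipeline; the features, labels, and model choice are assumptions), a model can be fitted centrally, serialized, and loaded on an edge node for local scoring:

```python
# Cloud side: train and export a model (illustrative features: weight_kg, distance_km).
import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression

X_train = np.array([[1.2, 5.0], [8.5, 30.0], [2.0, 12.0], [9.9, 80.0]])
y_train = np.array([0, 1, 0, 1])  # 0 = bicycle feasible, 1 = needs truck
model = LogisticRegression().fit(X_train, y_train)
joblib.dump(model, "routing_model.joblib")  # ship this artifact to edge nodes

# Edge side: load the artifact and make the decision locally, close to the sorting belt.
edge_model = joblib.load("routing_model.joblib")
print(edge_model.predict([[7.2, 25.0]]))  # -> [1], i.e. route via truck
```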

 

Of course, with any emerging technology, there are challenges and obstacles. Currently, what are the main challenges that tech leaders need to be aware of with SDL?

One of the challenges is that not all existing applications have been designed for real-time use cases, so temporary measures might be necessary, as well as a structural re-architecture. Machine learning models are another example: they can require a more complex deployment model, with an AI platform developing and re-training the model, which is then embedded in an algorithm in or near the business application where it is used. And with SDL, events can come from a multitude of actors, and they need to be handled by a highly scalable rule engine working against a single source of truth: the state machine of the logistical items. Tech leaders should also be aware of the impact on business operations and on the people whose work will change. Business and IT should work on SDL jointly and have a change management process in place from the start. 
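A minimal sketch of what such a single-source-of-truth state machine could look like (the states, events, and actor names are illustrative assumptions, not PostNL's actual model):

```python
# Allowed state transitions for a parcel; the single source of truth is the
# current state stored per parcel, and every actor's event is validated against it.
TRANSITIONS = {
    ("announced", "arrived_at_hub"): "in_sorting",
    ("in_sorting", "loaded"): "out_for_delivery",
    ("out_for_delivery", "delivered"): "delivered",
    ("out_for_delivery", "delivery_failed"): "in_sorting",  # retry next day
}

state = {"3SABCD123": "announced"}  # single source of truth per parcel

def handle_event(parcel_id: str, event: str, actor: str) -> str:
    """Apply an event from any actor (scanner, driver app, consumer app) if valid."""
    current = state[parcel_id]
    nxt = TRANSITIONS.get((current, event))
    if nxt is None:
        raise ValueError(f"{actor}: event '{event}' not allowed in state '{current}'")
    state[parcel_id] = nxt
    return nxt

handle_event("3SABCD123", "arrived_at_hub", actor="hub-scanner")
handle_event("3SABCD123", "loaded", actor="depot-scanner")
print(state["3SABCD123"])  # -> "out_for_delivery"
```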

 

How did PostNL overcome these pitfalls? What can other CTOs learn from your approach in tackling challenges?

Regarding algorithms, in the past years we moved from data science hypothesis projects to the development of algorithms with learning models for use in production. That is only possible with a multi-disciplinary approach combining data scientists, data engineers, and the DevOps teams where these algorithms will run or with which they will be integrated. We were not afraid to take on a high-profile initial case: we started working on it, learned from it, and eventually won a Computable Award in 2020 with an algorithm that predicts when a parcel will really arrive. The same applies to the real-time data case. It requires a multi-disciplinary approach, time, and capacity for innovation, as it has a lot of consequences that are not all known from the start. 

 

While cloud adoption is gaining momentum, there is still hesitancy among enterprises to fully adopt it. What should the approach be for CTOs to encourage technologies such as cloud and SDL within the organization?

Technology is not an island. Technology supports businesses in becoming successful. The processes of our customers and our own are getting more and more digital, and increasingly we do business through applications and machines instead of human interaction. Yes, an API is a technical way of accessing data and functionality, but in essence, it is a 100% digitized business service. Together with high volumes, the increasing number of digitized actors in our ecosystem, and the increasing flexibility our customers are asking for in the e-commerce domain, cloud and SDL are essential capabilities for digital business. 

 

Finally, what advice can you give to other organizations that are starting to invest in cloud technology? What are the common mistakes that CTOs should avoid when making their transition into the cloud?

Firstly, cloud is not an infrastructure play. It is a full-stack play and includes, or rather starts with, an application strategy. Rehosting alone will not provide true benefits. Understanding the business drivers and the future requirements of your applications provides the input for decisions to buy, consume, or build. It also influences decisions to retire, replace, or re-architect those applications, which has direct consequences for the cloud strategy and roadmap. 

Next, skills around networking, storage, and high-performance computing remain important and very relevant when moving to the cloud. You should keep these skills on board to avoid problems in the long run.  

In addition, the term multi-cloud is used a lot in the industry, though it means much more than using two or more public clouds. For us, any service a partner provides through the Internet is a cloud. This multi-cloud has to be managed from an overall functional, technical, and multi-supplier perspective. Lastly, when starting from a pure on-premises environment, the current IT department setup will probably not be aligned with cloud. Therefore, setting up a cloud competence centre is crucial.  

*The answers have been edited for length and clarity. 

The Retail Landscape in 2022: What’s in Store?

Consumer shopping habits have changed drastically in the past two years. Retailers from all business environments transformed to accommodate fluctuating customer expectations. The adoption of retail tech has also increased among brands to not only survive but thrive during these uncertain times. As the year comes to a close, what key retail trends are expected in 2022?  

 

Shifting Consumer Priorities and Habits 

Customer personalization 

According to Gregory Ng, CEO of Brooks Bell, today’s customers want to connect with their favorite brands smoothly on their own time. As more retailers provide an omnichannel experience, customers expect to receive quick, responsive, and personalized engagements throughout their buying journey. Several initiatives brands have taken include offering personalized product recommendations based on previous purchases, designing quizzes to ensure the product is tailored to individual customer needs, and gifting customers coupons during their birthday month. Whether retailers have a presence online, in-store, or both, a personalized shopping experience yields positive results. A study by Boston Consulting Group found that customers who experienced high personalization were 110% more likely to add more items to their carts and gave higher net promoter scores.  

Sustainability

Results from PwC’s December 2021 Global Consumer Insights Pulse Survey shed light on the increasing influence of sustainability on purchasing decisions. Consumers have become more eco-friendly compared to six months ago, and 51% of respondents take sustainability into consideration when buying a product, alongside pricing and convenience. Retailers are responding accordingly, introducing greener products, processes, and services. At Systembolaget, more organic products in climate-smart packaging are on their shelves due to customer demand. “It is about designing your channel and products in the most sustainable way and thinking it through from start to finish,” says Systembolaget CEO, Magdalena Gerger. A growing number of companies, from fashion to furniture, are also embracing the circular economy to reduce waste and extend product lifecycles.  

Smartphone shopping

Smartphones have made the online shopping experience seamless, accessible, and convenient. In PwC’s December 2021 Global Consumer Insights Pulse Survey, 41% of respondents state that they shop daily or weekly on their smartphones. For e-commerce sites and retailers, websites and apps are their shopfronts and they need to be mobile-friendly to boost sales and engagement. As smartphone usage is high among consumers, retailers must rewire their marketing efforts for the highest visibility in a mobile environment.  

 
Seize networking opportunities with Europe’s top retail leaders at IndustryForum Retail in the Netherlands.
 

Rethinking Retail Marketing Strategies 

An optichannel approach

Retailers who want to elevate customer experience levels should consider an optichannel strategy. By enhancing existing omnichannel marketing with statistics and user insights, retailers have a strong framework to improve automation, resource allocation, and communication with customers. The transformation from omnichannel to optichannel can be made possible with immediate steps such as mapping out customer journeys, quickening social media response time, and repurposing content.  

Customer data utilization

The more consumer data retailers collect, the better it is for business. But where should brands draw the line? Data privacy and protection must always be a priority for retailers and they need to strike a balance between collecting the data needed and making customers feel safe. However, finding this balance may be challenging with Google’s decision to end third-party cookies by 2023 and Apple’s new privacy policies. Retailers will now have to find creative ways to collect and leverage first-party data for marketing purposes.  

 
Join now: Learn about the newest tech in retail at our exclusive events, StrategyForum Retail and E-commerce in Denmark and European StrategyForum Retail in the Netherlands.
 

2022 Technology Trends in Retail 

AR and VR 

Augmented and virtual reality technologies provided solutions to create in-person shopping experiences during a time when customers preferred not to leave their homes. Fashion and beauty retailers were quick to adopt AR and VR, allowing customers to virtually try on their products before buying. Earlier this year, H&Mbeyond announced their collaboration with NeXR Technologies to develop a virtual fitting room. A personal avatar of a customer is created with the help of body scanning, giving them the freedom to adorn their avatars in different outfit combinations on an app before committing to a purchase. Virtual fitting rooms have the potential to increase online conversion rates and reduce returns.

Sensor technology

The ubiquity of smartphones has paved the way for sensor technology in retail. This technology is a game changer for physical retailers where consumer data collection is slower than their e-commerce competitors. With sensor technology, physical retailers can gain valuable customer information to enhance experiences and improve marketing efforts. For instance, scan & go apps allow customers to purchase and pay for items on their mobile phones, making their shopping trips fast and secure. Retailers can also install sensor technology inside store floors to track customer paths and determine the most-visited aisles and shelves.  

Headless commerce  

This system gives brands the flexibility to build composable and customizable applications to fulfill their customers’ needs. Companies that embrace headless commerce can launch and optimize new updates quickly as the front-end and back-end of an application are separated. Furthermore, headless commerce ensures the compatibility of a brand’s website across all viewing formats and devices. Even without a huge budget and experienced development teams, companies can utilize these headless commerce solutions to modernize their shopping platforms.
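As a rough illustration of that front-end/back-end split (a minimal sketch; the endpoint and fields are hypothetical, not a specific vendor's API), any front-end, whether web shop, mobile app, or kiosk, fetches the same product data over an API and decides on its own how to present it:

```python
import requests

# Hypothetical headless storefront API; in practice this would be the commerce
# platform's product endpoint, called by the web shop, mobile app, or kiosk alike.
API_BASE = "https://api.example-storefront.com"

def get_product(product_id: str) -> dict:
    """Back end: serve structured product data only, no presentation."""
    response = requests.get(f"{API_BASE}/products/{product_id}", timeout=5)
    response.raise_for_status()
    return response.json()  # e.g. {"id": "...", "name": "...", "price": ...}

def render_for_mobile(product: dict) -> str:
    """Front end: decide the presentation independently of the back end."""
    return f"{product['name']} – {product['price']} EUR"
```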

 

As the retail and e-commerce space gets more crowded, brands have to consistently find innovative ways to stand out and remain profitable. Leveraging the right technology and strengthening customer relationships are the building blocks for a resilient and sustainable retail business.  

What Does the Future of Cloud in Europe Look Like?

Cloud technologies have been catalysts for growth, innovation and agility for data-driven organizations across Europe. How do IT leaders ensure that their organizational cloud-based environments are scalable, effective and comply with relevant data privacy regulatory laws?  

Daniel Melin, Strategist at Skatteverket; and Kaj Kjellgren*, Senior Network Architect at Netnod Internet Exchange; help us navigate the current volatile cloud landscape and provide answers to important questions on cloud security, compliance, and challenges. In addition, we hear about the roles they play in the highly anticipated and talked about cloud project, Gaia-X.  

 

How can businesses ensure effective cloud data protection?  

Daniel: Customers need to choose cloud services that are sufficiently secure for their information. When evaluating security, the customer needs to take the whole spectrum of security into account: physical, IT, information, legal, and political. Security is like a chain, and every link has to be evaluated.

The Swedish Tax Agency has established a cloud center of excellence consisting of experts in IT security, legal, data protection, document and archiving, physical security, procurement, and architecture to make sure that all aspects are looked at before a new cloud service is enabled for users. 

Kaj: Protection of data must be based on an initial categorization of the data itself and identification of the requirements on each data element. Not every piece of data requires the same protection. Of course, there is legislation and there are traditional security requirements that have to be followed.

For information security, this normally comprises availability, correctness, and confidentiality. If you start from zero, orchestrated microservices are the easiest way of ensuring adequate protection using the zero trust concept to isolate the various containers touching the data. Once again, this has to be according to the defined requirements for each data element. This orchestration, often called cloud, can be self-hosted or hosted by third parties, just like any service an organization needs.  
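A minimal sketch of the categorize-then-protect idea (the categories and controls are illustrative assumptions, not Netnod's scheme): each data element gets a classification, and the classification drives the protection requirements applied to whatever workload touches it:

```python
# Illustrative mapping from data classification to required protections.
REQUIREMENTS = {
    "public":       {"encryption_at_rest": False, "network_isolation": False, "audit_log": False},
    "internal":     {"encryption_at_rest": True,  "network_isolation": False, "audit_log": True},
    "confidential": {"encryption_at_rest": True,  "network_isolation": True,  "audit_log": True},
}

data_catalog = {
    "marketing_site_assets": "public",
    "customer_invoices": "confidential",
    "internal_wiki": "internal",
}

def required_controls(data_element: str) -> dict:
    """Look up the controls a workload touching this data element must implement."""
    return REQUIREMENTS[data_catalog[data_element]]

print(required_controls("customer_invoices"))
# -> {'encryption_at_rest': True, 'network_isolation': True, 'audit_log': True}
```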

 

What are the biggest challenges concerning compliance with cloud data protection regulations and laws? 

Daniel: There are direct challenges with laws like the Swedish Public Access to Information and Secrecy Act (offentlighets- och sekretesslagen) and the GDPR. Both are challenges for Swedish public sector customers today. However, the Protective Security Act will be the hardest law to comply with, especially when a non-Swedish cloud provider has access to huge amounts of aggregated information. 

Kaj: The main legal challenge for any IT-related issue since 1990 is that legislation is different in different jurisdictions. The market economy pushes for large specialized organizations, services, and products that are bigger than any jurisdiction. This has hurt the flow of money and created tax havens for a number of years. A similar situation now exists for services. 

Those rules, made by politicians within imaginary borders, do not align with the foundation of the Internet, which was made by technicians and engineers to be open, free, and unlimited by country borders between jurisdictions. On top of that, no single economy today is large enough to produce services for itself alone without scale having an impact on the price of production. 

 

Tell us about your role in the Gaia-X project. 

Daniel: The Swedish Tax Agency currently has an assignment from the Swedish government to monitor Gaia-X. That work includes talking to all relevant stakeholders, gathering information, presenting at conferences, and taking part in the Swedish hub. We are positive about Gaia-X and what it brings to the table. 

Kaj: Netnod is one of the founding members of Gaia-X in Sweden, and together with similar organizations helps with basic services like transport which are needed for players higher up in the value chain. We are currently most active in the Sub-working group Interconnection & Networking which lies under the Architecture Workgroup within the Technical Committee under Gaia-X AISBL. 

 

What role does the human factor play in cloud security and vulnerability? 

Daniel: The human factor is as relevant as always; I don’t see that cloud services create any particular new challenges. However, a successful breach of a hyperscaler yields an extreme effect due to its size and storage of aggregated information. 

Kaj: When implementing any kind of service, there are many different kinds of threats, including insider actions, whether mistakes or intentional. This is where a proper orchestration of microservices using zero trust comes into play. The integrity of a pod managing certain data is important so that it is self-contained and secure regardless of how an attack against the data is designed. One never knows the goal of the attacker, so second-guessing detailed attack scenarios is always doomed to failure. There are always unknown unknowns.  

Most cloud services are provided as unmanaged components, pieces of a bigger puzzle, regardless of whether the cloud is self-hosted or not. The engineers at a company have to create a functional workflow that creates, configures, and secures solutions based on these pieces. This is both a big risk and a safety net, since a lot of people don’t fully understand the complexity of said services and tools, and don’t understand what needs to be secured or how. That being said, those tools are built to be robust and not expose users to dangerous or even impossible configurations. 

 

What areas should organizations consider when choosing a cloud service provider? 

Daniel: One of the biggest concerns today is that cloud service providers have to adapt better to customer needs. Currently, there are a handful of providers offering a one-size-fits-all model. It is certainly a cost-effective model, but the price tag on the invoice does not tell the whole story. The legal implications when using cloud services based in countries with extraterritorial legislation will be an ongoing issue. 

Kaj: Categorization of information must take place, followed by an analysis of what requirements there are in each category. The requirements have to take both legal and security (availability, confidentiality, and correctness) aspects into account. In some cases, there is a balance between goals where the so-called risk appetite is to be decided upon. Be aware of benefits and risks, and make sure you avoid creating solutions where there are too many unknown unknowns.  

 

What are your predictions for cloud trends in the next five years?

Daniel: We will see a market with more cloud providers, from small players to hyperscalers, which will provide cloud services that fit different customers. The American hyperscalers will continue to license their technology to other cloud providers. Laws and regulations related to national security will become broader and will affect both cloud providers and customers more and more. The effects of geopolitics will worsen over time, and the EU will follow China and the USA in being more protectionist. 

Kaj: We see more legislation, specifically in the EU, that isolates the EU from the rest of the world. This will create more borders that force us to use different solutions for different jurisdictions. What we instead need to do is harmonize the laws and regulations in different jurisdictions with each other so the market for IT-related services will not be as fragmented. We are close to a situation where we have serverless environments, with only pods managing information. Everything is orchestrated by mechanisms that understand both information and the policies applied to the information. 

 

The answers have been edited for length and clarity.

*Part of Kaj Kjellgren’s answers were contributed by his colleagues at Netnod: Mattias Ahnberg, Head of Architecture & Development; Patrik Fältström, Technical Director & Head of Security; and Christian Lindholm, Head of Sales and Marketing & Senior Product Manager

Digital Healthcare: How Is Technology Transforming Health?

In a post-pandemic world, it’s clear that digital transformation in healthcare has become essential. A McKinsey survey of 213 European physicians found that more than 50 percent believe telemedicine will be a significant part of a modern healthcare system. 

With the digital healthcare revolution underway in European nations, it’s evident that hospitals and clinics will need to adapt to emerging technologies and integrate digital health solutions as part of their strategy. 

But what is digital healthcare? And how is technology transforming healthcare?

In this article, we take a look at some of the ways hospitals and clinics are integrating digital health solutions, how digitalization is being used to fill the healthcare skill gap, and why data will be a significant tool in the care industry.

 

Adoption of Digital Technologies In Hospitals

Europe has been facing unprecedented pressure within the healthcare system and the pandemic has shown that despite the improvement in quantity and quality of care, there is still a gap in efforts toward digitalization.

This is due, in part, to several challenges, such as bureaucracy in healthcare and the cost to organizations of implementing digital technologies.

However, it does not mean that digital technologies are not being experimented with and utilized at all. In fact, due to the pandemic, certain countries are accelerating their adoption of digital and telemedicine solutions to help improve the quality of healthcare services provided.

One such example is Portugal’s use of the ePatient system for centralized and real-time patient data management. ePatient allowed clinicians to monitor and communicate with their patients remotely if they were not able to be present. 

 
 

This adoption of a digital healthcare solution has made home care easier for healthcare professionals in Portugal as they can communicate with each other over the application.

 

The Skill Gap In Digital Healthcare

With digital systems in place, hospitals and medical professionals will need to learn how to fully utilize these solutions to deliver care. However, many in the workforce, especially in nursing, still lack the skills and proficiency to handle digital healthcare solutions and technology.

Before organizations can scale up these digital systems, the digital divide and skill gap within the workforce need to be acknowledged. The workforce that delivers care to patients needs training and support to use new systems, and to use these technologies effectively to deliver high-quality digital care. 

How should organizations overcome this skill gap challenge?

Everything starts at the top: clear strategic direction from those in leadership roles to integrate new technologies and train the workforce to embrace them must be the priority. It’s important, however, that investments in new systems are guided by the organization’s long-term vision and account for sustainability.

Beyond that, investing and supporting educational initiatives that provide a platform for the workforce to develop these skills will be essential in filling the medical skill gap. One such initiative is the NURSEED program by a Danish collective company that seeks to address the nursing shortage and skill gap in Denmark through a digital platform.

 
 

Future of Health Is Digital and Data

Putting digital solutions in place and equipping the workforce with the necessary skills will lay the foundation for a digital healthcare revolution for many organizations. The next step is to fully embrace the healthcare digital transformation and understand the role of data analytics.

In recent years, big data tools have played significant roles in healthcare decision-making. This is in part due to the pandemic, which resulted in an enormous surge of health data being available, allowing for bigger and better analytics.

How can the health industry utilize these data?

Through descriptive, predictive, and prescriptive analytics, healthcare providers gain immediate access to the information they need and improve overall efficiency. 

 

For healthcare professionals, this would mean improved predictive modeling that can alert them of potential risks of chronic illness or even self-harm. And on a larger scale, it can even predict outbreaks.

With predictive and prescriptive analytics, organizations can expect a reduction in overall healthcare costs by reducing appointment no-shows, preventing equipment breakdowns, decreasing fraud, and even managing supply chain costs.
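To make the predictive-analytics point concrete, here is a minimal sketch (the features and data are made up for illustration, not drawn from any real healthcare dataset) of a model that flags patients at risk of missing an appointment so the clinic can intervene with a reminder:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative features: previous no-shows, days since booking, distance to clinic (km).
X = np.array([[0, 2, 1.0], [3, 30, 12.0], [1, 10, 4.0], [4, 45, 20.0], [0, 5, 2.5]])
y = np.array([0, 1, 0, 1, 0])  # 1 = missed the appointment

model = LogisticRegression().fit(X, y)

# Score tomorrow's appointments and flag the high-risk ones for a reminder call.
upcoming = np.array([[2, 25, 9.0], [0, 3, 1.5]])
risk = model.predict_proba(upcoming)[:, 1]
for patient, p in zip(["patient A", "patient B"], risk):
    print(patient, "no-show risk:", round(float(p), 2))
```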

Bottom line: better data leads to better healthcare.

 

Technology Is Transforming Healthcare

There is no doubt that the healthcare digital revolution is underway and technology will transform the solutions and approaches in modern care. The question is now whether organizations are changing fast enough to keep up with the demands of modern healthcare.

The Top Worry In Cloud Security for 2021

The cloud is an environment full of potential. It provides easy access to technologies that simply weren’t available a decade ago. You can now launch the equivalent of an entire data center with a single command.

Scaling to meet the demands of millions of customers can be entirely automated. Advanced machine learning analysis is as simple as one API call. This has allowed teams to speed up innovation and focus almost exclusively on delivering business value.
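For instance, a minimal sketch using AWS’s managed natural-language service, Amazon Comprehend (credentials and region configuration are assumed to be in place): a single API call returns an analysis that would once have required building and hosting your own model.

```python
import boto3

# One call to a managed service replaces training, hosting, and scaling your own model.
comprehend = boto3.client("comprehend", region_name="eu-west-1")
result = comprehend.detect_sentiment(
    Text="The new checkout flow is fast and the delivery arrived a day early.",
    LanguageCode="en",
)
print(result["Sentiment"])        # e.g. "POSITIVE"
print(result["SentimentScore"])   # per-class confidence scores
```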

But it’s not all unicorns and rainbows.

The assumption was that alongside this increased potential, the security challenges we see on-premises would grow as well. Teams would be struggling with zero-days, vulnerability chains, and shadow IT.

It turns out they aren’t. At least those issues are nowhere near the top of their list of concerns. The top security challenge for builders in the cloud is very straightforward.

Their biggest challenge is making mistakes in the form of service misconfigurations.

 

Shared Responsibility

First, let’s look at the evidence around the initial assumption that people make about cloud security. They assume the cloud service providers themselves are a big risk. The data doesn’t support this at all.

The big four cloud service providers (Alibaba Cloud, AWS, Google Cloud, and Microsoft Azure) have had two security breaches in their services over the past five years…combined. Now, before we get into those, it’s important to note that each of the big four has had to deal with tons of security vulnerabilities over this timeframe.

A large number of cloud services are simplified managed service offerings of popular commercial or open-source projects. These projects have had various security issues that the providers have had to deal with.

The advantage for us as users, and builders, is how operations work in the cloud. All operational work done in any cloud follows the Shared Responsibility Model. It’s very straightforward.

There are six primary areas where daily operational work is required. Depending on the type of service you are using in the cloud, your responsibilities shift. If you’re using instances or virtual machines, you are responsible for the operating system, the applications running on that OS, and your data. As you move to an entirely managed service, you are only responsible for the data you process and store with the service.

For all types of cloud services, you are responsible for service configuration. Despite this clear line of responsibilities, the providers offer many features to help you meet your responsibilities and adjust the services to suit your needs.
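As an example of what being responsible for service configuration looks like in practice (a minimal sketch using AWS S3; the bucket name is hypothetical and credentials are assumed to be configured), you can check whether a bucket blocks public access, one of the most common misconfigurations:

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "example-customer-data-bucket"  # hypothetical bucket name

try:
    config = s3.get_public_access_block(Bucket=bucket)["PublicAccessBlockConfiguration"]
    if not all(config.values()):
        print(f"{bucket}: public access is not fully blocked -> review configuration")
    else:
        print(f"{bucket}: public access fully blocked")
except ClientError as err:
    # Having no public access block configured at all is itself a finding.
    print(f"{bucket}: no public access block set ({err.response['Error']['Code']})")
```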

 

Cloud Service Provider Issues

Now, let us take a look at the providers’ security issues over the past five years… The first one is from March 2020. In this case, Google Cloud paid out a $100,000 reward through their bug bounty program to a security researcher who found a privilege escalation issue in Google Cloud Shell.

This is a service that provides a browser-based interface to the command line of a virtual machine running in your account. Under the covers, this shell is simply a container running an application to provide the required access. The researcher noticed that they were able to use a socket connection in the container to compromise the host machine and escalate their access.

The root cause? A misconfiguration in the access to that socket.

The second example is from January 2020 and it involved a service offered in Microsoft Azure. Here an issue was reported in the Microsoft App Service offering. This vulnerability allowed an attacker to escape the expected boundaries of the service and access a limited-scope deployment server with elevated privileges.

The reason? A misconfiguration in the open-source tool used to provide this web app hosting service.

In both cases, the vulnerabilities were responsibly disclosed and quickly fixed. Neither issue led to any reported customer impact. Both of these cases were in higher-level cloud services. These are services that the providers’ teams built using other services on the platform. As a result, and in line with the shared responsibility model, they were at risk of a service misconfiguration.

Even hyper-scale providers face this challenge!

 

3rd Party Validation

There’s more evidence to support the fact that misconfigurations are the biggest issue in cloud security. Security researchers in the community who study cloud issues have all published findings that align with this premise. Whether from other security vendors or industry organizations, the findings agree: 65-70% of all security issues in the cloud start with a misconfiguration.

Making it worse, 45% of organizations believe that privacy and security challenges are a barrier to cloud adoption.

Why is that worse?

When understood, the shared responsibility model makes it easier to maintain a strong security posture. Organizations should be pushing to move faster to the cloud to improve their security!

 

Direct evidence

However, surveys and targeted research projects only go so far. What does the publicly available evidence say? Here’s a list of some of the most visible cloud security breaches in recent years;

 
 

If you filter out all the reports of cloud hacks and breaches to remove incidents that were not cloud-specific—so those where the issue wasn’t related to the cloud, the service just happened to be there—over two billion sensitive records have been exposed through a breach in cloud security.

Let’s take this further and remove every single breach that wasn’t due to a single misconfiguration.

Yes, single. One wrong setting. One incorrect permission. One simple mistake…caused all of these breaches.

That leaves the Capital One breach. This more complicated event was caused by…two misconfigurations and a bug. An in-depth analysis of this breach shows that the bug was inconsequential to the overall impact, which was 100 million customer records being exposed.

What’s more, Capital One is a very mature cloud user. They are a reference customer for AWS, they’ve been a huge advocate of the cloud within the community, and they were the incubator for the very popular open-source security, governance, and management tool, Cloud Custodian.

This is a team that knows what they are doing. And yet, they still made a mistake.

 

Pace of Change

That’s really what misconfigurations are. They are mistakes. Sometimes those mistakes are oversights; other times they are incorrect choices made due to a lack of awareness.

It all comes back to the power made accessible by the cloud. Reducing these barriers has brought a commensurate increase in the pace of innovation. Teams are moving faster. As these teams mature, they can maintain a high rate of innovation with a low failure rate.

In fact, 43% of teams who have adopted a DevOps philosophy can deploy at least once a week while maintaining a failure rate of under 15%.

Critically, when they do encounter a failure, they can resolve it within the day…more impressively, 46% of those teams resolve those issues within the hour. But, as we know, cybercriminals don’t need a day. Any opening can be enough to gain a foothold and create an incident.

What about teams that aren’t at this pace? Well, the other 57% of teams, the majority of which are large enterprises, often feel that their lack of pace provides protection. Moving cautiously in the cloud allows them to take a more measured approach and reduce their error rates.

While this may be true—and there’s no evidence to support or disprove this assumption—change is still happening around them. The cloud service providers themselves are moving at a rapid pace.

In 2020, the big four hyper-scale providers released over 5,000 new features for their services. For single cloud users, that means almost 2 new features a day…at a minimum. For the growing set of multi-cloud users, the pace of change only increases. So even if your team is moving slowly, the ground underneath them is shifting rapidly.

 

Goal of cybersecurity

Now, the goal of cybersecurity is actually quite simple. The goal is to ensure that whatever is built works as intended and only as intended. In a traditional on-premises environment, the standard approach is a strong perimeter and deep visibility across the enterprise.

That doesn’t work in the cloud. The pace of change is too rapid, both internally and with the provider. Smaller teams are building more and more. Quite often, by design, these teams act outside of the central CIO infrastructure.

This requires that security is treated as another aspect of building well. It cannot be treated as a stand-alone activity. This sounds like a monumental task, but it’s not. It starts with two key questions:

  1. What else can this do?
  2. Are you sure?

Take the container running the code that creates the financial reports. What else can it do? Can it access other types of data? Are you even sure it’s the right container?

This is where security controls provide the most value.

 

Top pain points to address

Most of the time when we talk about security controls, we talk about what they stop. Using an intrusion prevention system can stop worms and other types of network attacks. Anti-malware controls can stop ransomware, crypto miners, and other malicious behaviors.

For every security control, we have a list of things it stops. This is excellent and works well with subject matter experts…a.k.a. the security team.

Builders have a different perspective. Builders want to build. When framed in the proper context, it’s easy to show how security controls can help them build better.

Posture management helps ensure that settings stay set regardless of how many times a team deploys during the week. Network controls assure teams that only valid traffic ever reaches their code. Container admission control makes sure that the right container is deployed at the right time.
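For instance, a minimal sketch of the admission-control idea (not any specific product; the service names, digests, and policy are made up): an admission check compares the image a deployment requests against an allow-list of approved digests before it is allowed to run.

```python
# Allow-list of image digests that have passed the build pipeline's checks.
APPROVED_DIGESTS = {
    "reports-service": "sha256:9f2a1c...",   # truncated for readability
    "checkout-service": "sha256:7b44de...",
}

def admit(deployment: dict) -> bool:
    """Admit a deployment only if it uses the approved digest for that service."""
    expected = APPROVED_DIGESTS.get(deployment["service"])
    if deployment["image_digest"] != expected:
        print(f"rejected: {deployment['service']} uses an unapproved image")
        return False
    print(f"admitted: {deployment['service']}")
    return True

admit({"service": "reports-service", "image_digest": "sha256:9f2a1c..."})  # admitted
admit({"service": "reports-service", "image_digest": "sha256:deadbeef"})   # rejected
```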

Security controls do so much more than just stop things from happening. They provide answers to critical questions that builders are starting to ask.

“What else can this do?” Very little, thanks to these security controls.

“Are you sure?” Yes. I have these controls in place to make sure.

When built well and deployed intelligently, security controls help teams deliver more dependable, easier-to-observe, and more reliable solutions.

Security helps you build better.

Himadri Majumdar: How to Become a Global Industry Leader with Quantum Computing

Digital transformation is advancing at lightning speed. In a perfect world, we would test out every available emerging technology, but in real life this is impossible due to time and budget constraints. Therefore, CIOs must identify and invest in the right IT technologies that will benefit their organizations the most. 

Quantum computing is at the forefront of IT technologies, presenting today’s CIOs with solutions for IT preparedness, cyber resilience and business continuity. We speak with Himadri Majumdar, Program Manager, Quantum at VTT on quantum computing and why it is essential that IT leaders pilot this technology as soon as possible.  

 

Investments in new technologies and digital tools are crucial for business continuity. Why should organizations invest in quantum computing?

It is imperative that companies try out quantum computing as soon as possible. The world is moving forward fast, making it important to see and adopt the benefits of quantum computing to stay ahead of the competition. 

Luckily, organizations actually do not need to invest in quantum computing to try out or gain the initial benefits it enables. As quantum computer procurement is a significant investment, it is wise to leverage other methods of access to quantum computers rather than building or buying one. There are multiple providers of access and services of quantum computers in the cloud. IBM is one of the biggest and earliest players. 

The smartest thing to do is to pilot the available services and evaluate whether quantum computing could be beneficial for your business according to the following guidelines:  

  • Only make decisions once you see a clear business benefit. The investment will depend on the magnitude of the benefit. The bigger the benefit, the bigger the investment can be. 
  • If you decide that the benefits are so great that you would like to buy or build a quantum computer, there are companies that provide customized, problem-specific quantum computers.  
  • If the benefits are good but not that big, then continuing with quantum computers in the cloud might still be a good option. In that case, you also do not need to hire or build a large in-house quantum computing team. Companies can leverage the services provided by consultancies that can deliver solutions customized to your business needs.  

Any model that works best for your company is ideal.  

 

Which industries will benefit from quantum computing the most? 

In simple words, decision-making in any business is based on the compromise of a huge number of, often conflicting, choices or parameters. Therefore, industries that have optimization-related aspects playing an important role in their business will need quantum computing. This can be related to process optimization, logistics optimization, and data optimization, among others. 

For example, if you are in the logistics business, in-time delivery might depend on parameters such as in-time delivery of goods from a partner, availability of fleet, choices of drivers, weather conditions, and real-time traffic towards the destination. When multiple parameters are considered, more accurate predictions can be made.  

However, computing various options with many parameters utilizing classical computers will take a long time – hours or even days. This often results in businesses making compromises by considering fewer parameters.

This can be illustrated by going back to the logistics example: businesses can compromise by choosing to ignore data on real-time traffic. The worst-case scenario of omitting real-time traffic is delays in delivery and poor customer experience.  

For more accurate predictions based on as many parameters as possible, we need computing that enables faster optimization. This is why quantum computers are critical.
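To illustrate why classical optimization over many parameters gets slow (a toy example, not a real routing model; the parameters and cost function are made up), a brute-force search has to evaluate every combination, and the number of combinations grows multiplicatively with each parameter added:

```python
import itertools

# Toy decision parameters for a delivery plan; each extra parameter multiplies the search space.
options = {
    "fleet": ["bike", "van", "truck"],
    "driver": ["A", "B", "C", "D"],
    "departure_slot": list(range(12)),      # half-hour slots in a morning
    "route": ["ring-road", "city-centre", "motorway"],
    "weather_plan": ["normal", "rain-buffer"],
}

def cost(plan: dict) -> float:
    """Stand-in cost function; a real one would use live traffic, load, SLAs, etc."""
    return len(plan["route"]) + plan["departure_slot"] * 0.1 + (plan["fleet"] == "truck")

combinations = list(itertools.product(*options.values()))
print("combinations to evaluate:", len(combinations))  # 3*4*12*3*2 = 864, and it only grows

best = min(combinations, key=lambda values: cost(dict(zip(options, values))))
print("best plan:", dict(zip(options, best)))
```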

 

In 2020, VTT launched an ambitious three-phase project to acquire Finland’s first quantum computer. What are you most excited about leading this project? What progress has VTT made so far? 

I am excited for many reasons. Firstly, I am very excited that we are able to build almost the whole computer indigenously.  

Quantum technology is so strong in Finland that we do not need to rely on significant parts and components from elsewhere to build the machine. Companies like Bluefors and IQM are big domestic players with a strong global presence and acceptance. They have successfully capitalized on the deep low-temperature physics expertise and technology developed in Finland since the 1960s and are now leading the field. Therefore, we can be very proud that we in Finland invested in this technology so well and so early that we are now in the perfect position to reap the early benefits and lead quantum computing globally.

Secondly, I am excited about the possibilities that Finnish companies will have. Companies that will be users of quantum computers will be able to find world-leading solutions close to home. They can become global leaders in their respective fields by leveraging quantum computing.  

There are so many other excellent reasons too. We are on track for the first phase of building the quantum computer in Espoo, Finland. We expect to demonstrate the 5-qubit quantum computer by the end of 2021. We will then continue with phases 2 and 3, building 20- and 50-qubit computers respectively. We are also making excellent progress on the R&D front, which will help us make quantum computers more integrated and cheaper in the future.  

At VTT we now have a dedicated quantum algorithm team comprising experts in quantum theory, mathematics, and AI. The team is ready to help companies see the benefits of quantum computing in their businesses. 

 

Organizations are more vulnerable to cyber-attacks than ever before with the rise of digitalization. What is your advice on building a resilient and scalable cybersecurity system? 

Indeed. Cybersecurity is one of the biggest threats to businesses in this decade. We have already witnessed the risks in the U.S., where the vulnerability of even traditional businesses, like oil and gas, has been exposed through ransomware attacks. So, we need to be prepared.  

Quantum computing and quantum communication add another dimension to cybersecurity. Quantum communication is an emerging topic that will provide the basis for quantum-safe (tele)communication protocols built on techniques like quantum key distribution (QKD). It needs to be understood that quantum computers are amazing codebreakers. Once there are affordable and fully deployed quantum computers on the market, malicious players will take advantage of them to break current cryptography protocols like RSA. We must be prepared for that too. Europe, and more specifically Finland, is still at the initial stages of making its communication infrastructure quantum-safe. Currently available QKD solutions involve dedicated hardware in special-purpose networks, but in the long term we will need to improve safety protocols for communications more generally.  

Apart from the quantum communications hardware I mentioned above, we also have to be ready from a software perspective. We have to update or replace classical software with new quantum-resistant algorithms that cannot be broken by quantum computers. This software is what we call post-quantum cryptography. Finland is already running a big national project on this topic. We are getting prepared with cryptographic and cybersecurity codes that will protect us from attacks made with quantum computers. 
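A minimal sketch of where that software-side preparation could start (purely illustrative; the system names and algorithm inventory are assumptions): taking stock of which cryptographic algorithms are in use and flagging the ones a large quantum computer could break, so they can be scheduled for replacement with post-quantum alternatives.

```python
# Public-key algorithms whose security relies on factoring or discrete logarithms are
# breakable by Shor's algorithm on a large quantum computer; symmetric ciphers and
# hashes mainly need larger parameters rather than replacement.
QUANTUM_VULNERABLE = {"RSA-2048", "ECDSA-P256", "DH-2048"}

systems = {
    "customer-portal-tls": ["RSA-2048", "AES-256"],
    "internal-vpn": ["ECDSA-P256", "AES-256"],
    "backup-signing": ["RSA-2048", "SHA-256"],
}

for name, algorithms in systems.items():
    at_risk = [a for a in algorithms if a in QUANTUM_VULNERABLE]
    if at_risk:
        print(f"{name}: plan migration away from {at_risk} to a post-quantum scheme")
    else:
        print(f"{name}: no quantum-vulnerable public-key algorithms found")
```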

This is a two-pronged approach where we use quantum communications to our advantage to strengthen cybersecurity and create solutions that keep organizations secure from attacks by malicious quantum computers. 

 

How do you expect quantum computing and post-quantum cryptography to affect IT trends in 2021 and 2022? 

The National Institute of Standards and Technology (NIST) in the U.S. is leading the effort globally. The goal of post-quantum cryptography, also known as quantum-resistant cryptography, is to develop cryptographic systems that are secure against both quantum and classical computers and can interoperate with existing communications protocols and networks. Almost 70 potential candidates have been narrowed down to seven in 2020. In 2021, the winner(s) will be declared, and it will become the chosen platform for future post-quantum cryptography.  

Efforts in 2021 and 2022 will be dedicated to identifying and understanding the new standards and how they can be implemented. Following that, the implementation phase will begin. Time is of the essence, as quantum computing, the potential threat that makes post-quantum cryptography relevant, is making progress very fast. Preparedness for the future needs to start early enough for companies to have business continuity in the post-quantum era. 

 

Today’s CIO is no longer just a manager of the IT department. How has the IT leader’s role transformed since the pandemic? 

I agree. The CIO’s office is now both the first line of defence for a company’s IT and the solution provider for the company’s current and future ICT needs. During the pandemic, the CIO’s office went into overdrive to create IT solutions that could keep the business running with employees working remotely, without compromising security.  

Finding solutions for remote work placed so much pressure on IT teams that they had to, very unwillingly, make some security compromises in favour of business continuity. The pandemic was an unforeseen, unfortunate event, and not every business was prepared for it.

The IT security vulnerability caused by this sudden change has left many companies susceptible to ransomware attacks. We will probably only learn the full extent of this in the future, but it is not hard to imagine the magnitude of it.  

Therefore, the CIO’s office should also look into future opportunities and threats like quantum computing and communications. This could be a strong aspect of their IT preparedness for the future. If the situation demands it, they will not need to make any security compromises. In that respect, post-quantum cryptography is one topic that CIOs should start paying attention to. 

 

 *The answers have been edited for length and clarity. 

Smart Manufacturing: The Future of Industry?

Johann Hofmann, founder of ValueFactoring® at MR Maschinenfabrik Reinhausen GmbH, is regarded as an expert in Industry 4.0 and digitalization, with more than 30 years of professional experience. Early in his career, he paved the digital way for his company and developed an assistance system for digital high-performance manufacturing, for which he received the Industrie 4.0 Award. 

Today, the charismatic mechanical engineer helps decision-makers gain confidence and know-how around digitalization and Industry 4.0. In this interview, he shows us where smart manufacturing in Germany stands today and where there is still room for improvement. 

 

The New Normal in Industry 4.0

How has the current state of affairs, the “new normal”, changed the outlook for Industry 4.0? Are manufacturers on the right track when it comes to the necessary digitalization?

 

This is about manufacturers of connectable products. A machine tool or a tool cabinet, for example, is a connectable product. These products are also known as assets in the sense of Industry 4.0. And precisely these assets have to become Industry 4.0 components for the whole thing to work.

For that, every asset needs an asset administration shell (Verwaltungsschale). The administration shell is essentially part of the digital twin. It simplifies networking, so that we get away from the tinkering that often arises when connecting things. 

This worked wonderfully with printers. When we install a new printer on Windows 10 today, we plug it in and it installs itself. I imagine something similar for machine tools. But for that, the manufacturers of these connectable products have to deliver an Industry 4.0 component: not just the product itself, but also the digital twin, such as the administration shell. Then they are on the right track.
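To make this idea concrete, here is a minimal sketch of what a machine-readable twin descriptor shipped alongside an asset could contain. It is purely illustrative: it simplifies the actual Asset Administration Shell metamodel, and the fields and values are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class SubmodelProperty:
    name: str
    value: object
    unit: str = ""

@dataclass
class AdministrationShell:
    """Simplified, illustrative descriptor delivered with a connectable asset."""
    asset_id: str
    asset_type: str
    submodels: dict = field(default_factory=dict)

shell = AdministrationShell(
    asset_id="urn:example:machine-tool:4711",
    asset_type="machine tool",
    submodels={
        "identification": [SubmodelProperty("manufacturer", "Example GmbH"),
                           SubmodelProperty("serial_number", "MT-4711")],
        "technical_data": [SubmodelProperty("spindle_speed_max", 18000, "1/min"),
                           SubmodelProperty("axis_count", 5)],
        "connectivity": [SubmodelProperty("protocol", "OPC UA"),
                         SubmodelProperty("endpoint", "opc.tcp://machine-4711:4840")],
    },
)

# Shop-floor software can discover and connect to the asset from this descriptor alone,
# instead of hand-wiring every new machine.
print(shell.submodels["connectivity"][1].value)
```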

 

Changes Brought by the New Normal

Has the new normal changed a lot in recent months, or has this been an issue from the start?

 

What I just described is not reality but a vision. It still has to emerge step by step across all asset manufacturers. The manufacturers of connectable products are currently holding back because there is no software yet that can handle it. At the same time, the software vendors are holding back because there are no connectable products yet that come with an administration shell. 

The Plattform Industrie 4.0 still has to put real pressure on all the manufacturers and create a set of rules to get this moving. So this “normal state” is not here yet; it is a “desired” normal state at some point in the future.

 

How far along is this plan already? Can you give us an assessment?

 

That depends on the industry. Different industries are at different stages. And I have the feeling that the industry I work in, discrete manufacturing, is lagging furthest behind. But that is just a personal assessment (laughs).

 
Looking for more insightful discussions? Meet Johann Hofmann and other industry experts at our next event, IndustryForum Smart Manufacturing.
 

Outlook: Augmented Reality, Wearables, and the Cloud

Are there other emerging technologies that manufacturers should keep an eye on? What challenges come with adopting such technologies?

 

Three technologies come to mind right away. The first is smart glasses, that is, augmented reality delivered through data glasses. This technology is still a bit sickly, though. Years ago we launched a trial balloon for retooling machines using digital glasses that let you see the working area digitally. It did not yet run entirely smoothly, so the project was put on ice for the time being. Nevertheless, I have high expectations here. With every new version of smart glasses, it will certainly get better and more capable. 

This technology could be overtaken by wearables, that is, computers worn on the body, such as a smartwatch or data gloves. There will certainly come a time when the foreman walks through a production hall and receives information on his smartwatch from a machine tool that is currently having some problem. That will surely become standard in the future. 

The third technology is the cloud. Years ago I was a bigger fan of the cloud. That enthusiasm has flattened off a bit, because all our customers in the manufacturing industry are afraid of the cloud. But the topic should definitely be pursued further. There are industries that are completely in the cloud, Amazon for example. In our industry specifically, however, we are lagging a bit behind, but we have to stay on the ball.

 

The Importance of Digital Twins in Smart Manufacturing

The digital twin has become increasingly popular among manufacturers trying to keep pace with start-ups like Tesla. How should older, long-established factories approach the introduction of a digital twin?

 

From the user’s perspective, the question is easy to answer. When a factory orders a new machine, a tool cabinet, or any other product, it should demand a digital twin from the manufacturer. In other words, not only the machine but also the digital twin should be delivered. 

And I can demand that from the manufacturer, because as a buyer I also have a certain amount of power. If I use that power, then over time all manufacturers will be gently forced to deliver the digital twin. And that is exactly where things are failing at the moment. An example comes to mind: 

Around 15 years ago, when I was building up our tool databases, we had no picture or graphic for the tools. We had to employ students on the side to draw the graphics for the tools for us. Back then, I asked our purchaser to write into the SAP record for every purchase that the tool AND a graphic had to be delivered. If either one was not delivered, we would not pay. 

Of course, at the time this caused a huge outcry in the tooling industry, but today it is standard. Anyone who buys a tool today naturally gets a graphic with it. So anyone who buys products can build up pressure on the manufacturer. Now, however, we should not be demanding the graphic but the digital twin. If everyone demands that, you can imagine the pressure to act that would suddenly prevail among the manufacturers.

 

So that is still missing. But is it also feasible and realistic?

 

(Laughs) We simply have to try it first. There are already many manufacturers that deliver a digital twin, but in some cases it is not yet complete. The industry is definitely on a good path here, but it has not reached the goal yet. To reach that goal faster, however, purchasers can build up this pressure on the manufacturers.

 

The Five Natural Laws of Digitalization

What are the biggest challenges for factories in applying digital twins as part of their existing processes? What can manufacturers do to master these challenges?

 

Ich beschäftige mich mittlerweile seit 33 Jahren mit der Digitalisierung. Dabei haben sich immer die gleichen Herausforderungen herauskristallisiert. Wenn ein Projekt gekränkelt hat oder gescheitert ist, war das immer eine von fünf Herausforderungen, die nicht vernünftig angepackt worden ist. Diese fünf Herausforderungen nenne ich die „fünf Naturgesetze der Digitalisierung“. Diese muss man abarbeiten, damit solche Projekte auch gelingen können. 

Das erste Naturgesetz ist: Menschen mitnehmen. Klingt erst einmal ganz banal, aber wenn die MitarbeiterInnen nicht wollen, wird das Projekt immer scheitern. Man muss Menschen also von dem Projekt begeistern. Da gibt es eine Metapher, die man vor allem auf Konferenzen häufig hört: Wenn du ein Schiff bauen willst, erzähl den Leuten nicht, was sie alles für den Bau benötigen. Erzähle ihnen nur von der Schönheit des Meeres. Dann wirst du auch das beste Boot bekommen, das du dir vorstellen kannst. 

Das zweite Naturgesetz ist: Wenn man einen schlechten analogen Prozess digitalisiert, dann bekommt man einen noch schlechteren digitalen Prozess. Analoge Prozesse müssen also schon vorher schlank und einfach gemacht werden. Dazu eignen sich LEAN-Methoden hervorragend. Also LEAN einführen und leben, das ist das zweite Naturgesetz der Digitalisierung und somit auch eine Herausforderung.

Das dritte Naturgesetz sind Stammdaten. Viele Projekte scheitern an unvollständigen Stammdaten. Diese müssen vollständig und fehlerfrei sein. Wenn du beispielsweise mit einem Navigationssystem durch Deutschland fährst, dann sind die Landkarten die Stammdaten. Wenn ich nun in ein neues Industriegebiet fahre, und mein Navi diese neue Landkarte noch nicht kennt, dann fehlen mir hier die Stammdaten. 

Das vierte Naturgesetz ist: Die Konnektivität im Brown-Field herstellen. Wir leben und arbeiten ja alle in einem Brown-Field. Green-Field wäre eine nagelneue Fabrik mit nagelneuen Maschinen und Werkzeugen. Doch das hat keiner. Wir alle haben einen historisch gewachsenen Maschinenpark. Wir haben also einen maschinellen Zoo an unterschiedlichen Maschinen, die 24 Stunden am Tag und 365 Tage im Jahr stabil vernetzt sein müssen. Wenn ich diese Konnektivität nicht erreichen kann, kann ich alles andere auch nicht schaffen. 

The fifth natural law is: open ecosystems. In our industry, you should not look for one system that does everything, because that is an illusion. The solution looks like this: the digital factory consists of several independent ecosystems with a planning meta system at their center; as of today, that is the ERP system. Below this meta level sit the individual ecosystems with their domain-specific workflows and processes, for example detailed production scheduling, a warehouse system, a CAQ system, a PLM system, a shopfloor system, and so on. The interoperability of these systems is a decisive criterion for their selection. Orchestrating the individual ecosystems correctly is what delivers the real added value of digitalization.
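
To illustrate how a meta system can orchestrate independent ecosystems without coupling them to each other, here is a minimal, purely hypothetical sketch in which an ERP-like orchestrator publishes events that domain systems subscribe to; the system names and event types are invented for this example and do not describe any real product:

```python
# Illustrative sketch: a meta system (ERP-like) orchestrating independent
# domain ("eco") systems via events. All names and event types are hypothetical.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = EventBus()

# Domain systems only know their own events, not each other.
bus.subscribe("production_order.released", lambda order: print("Detailed scheduler: plan", order["id"]))
bus.subscribe("production_order.released", lambda order: print("Warehouse system: stage material for", order["id"]))
bus.subscribe("production_order.completed", lambda order: print("CAQ system: record quality data for", order["id"]))

# The meta system (here the ERP) orchestrates by publishing events.
bus.publish("production_order.released", {"id": "PO-4711"})
bus.publish("production_order.completed", {"id": "PO-4711"})
```

The point of the sketch is the design choice: each ecosystem only knows the events it cares about, so interoperability comes from a shared event contract rather than from point-to-point integrations.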

Those are the five challenges you first have to master before you can even make the next big leap with AI. Without them, a digital twin is of no use either. If that foundation is not in place, the digital twin is fighting a losing battle.

 

Sustainability in Industry 4.0

Due to megatrends such as climate change, sustainability is of the utmost importance for infrastructure. What important changes are manufacturers making to keep pace with sustainability regulations?

 

(Laughs) Now you've opened up a huge topic. What does sustainability actually mean? "Ensuring the natural regenerative capacity of the systems involved." Using forestry as an example, it means not felling more wood than can grow back. If I cut down a tree, I have to plant a tree that can eventually be harvested as well. That is sustainability.

But what does that mean for us in production? On the shop floor it would mean not consuming more raw materials than can regrow, for example rare earths, crude oil, or iron ore. But that is not possible. I use this statement ("not possible") extremely rarely. In this case, however, these materials took millions of years to form in the first place. So the whole sustainability debate in manufacturing is rather hypocritical, because it is simply not possible.

We are exploiting the earth until it collapses. Period. That is the problem we all have on this planet. So what should manufacturers do now to at least get some things reasonably right?

They could invest in energy-efficient equipment so that CO2 emissions are also reduced. Travel time could also be minimized. The coronavirus pandemic has saved me so much travel time, and that can be kept up after corona as well. In addition, working from home should remain possible after the pandemic wherever feasible. That saves the daily commute to the office, and with it fuel.

An example: some manufacture a golden steering wheel with steering-wheel heating, while others build a Tesla. It is not the golden steering wheel that should be optimized; development should happen in the right places. But whether the electric car is really that good for climate change is an entirely different discussion.

Another method that could help with sustainability is just-in-time production. During my studies in 1989 this was a major topic. The principle is: the best warehouse is no warehouse; the best transport is no transport. Production could go straight into assembly instead of first going into a warehouse and being transported onward from there. It is an age-old method, but it can certainly be sustainable.

 

Is that a realistic method? Are manufacturers perhaps already implementing it?

 

The automotive industry implements this brilliantly. Seat manufacturers, for example, do not drive their seats to a warehouse but directly to the assembly line at the car plant, where they are installed just in time. No other industry manages this as well as the automotive industry does. There is still plenty of optimization potential for our industry there.

 

Preparing Properly for the Future

To what extent can future scenarios help decision-makers refine their plans and develop strategies to prepare for the future?

 

Future scenarios only help decision-makers if they have actually predicted the future correctly. Otherwise, plans may emerge that do not refine things but ruin them. If I start from a completely wrong scenario, I will make completely wrong decisions.

That always reminds me of the comedian Karl Valentin, who said: "Forecasts are difficult, especially when they concern the future." (Laughs)

Still, when it comes to questions about the future, I can offer some guidance, for example the five natural laws of digitalization. They have proven themselves over decades as a basis for sound decisions and will remain valid in the future. If you use them as a guide, you really cannot go far wrong. If you scan this QR code, you will find a valuable resource in which my insights from 33 years of digitalization are presented in condensed form:

Understanding The Tech Challenges of Retail Giants

With more and more people embracing digital and smart shopping experiences, the retail market is scrambling to adopt new retail technology to remain viable and sustain growth in a rapidly changing landscape.

In this article, we’ll highlight some of the major challenges businesses are facing and the solutions they are looking for. For a more in-depth look at the trends of the retail industry, head over to our Retail Investment 2021 report.

 

Challenge 1: Evolving and Enhancing CX

 

Customer experience is expected to shift even more in 2021. As consumers become more conscious of their spending, retailers will need to optimize every step of the customer journey to maintain loyalty and spark growth.

Improving the customer journey will require significant investment in key retail technology trends, particularly technology that improves process efficiency, such as AI and automation, as well as mapping out and enhancing customer touchpoints.

A quick look at the core focus areas among retail leaders shows that many organizations are prioritizing smart solutions and digital competency to handle customer needs and ensure quality CX.

 
 

What Are They Looking For

 

Improving the experience for customers by delivering fast and accurate responses through CX software that integrates marketing automation, customer service, CRM, CPQ, sales force automation (SFA), and customer data platform (CDP) solutions.

 

Challenge 2: Deciphering The Data

 

Achieving effective customer journey optimization will require targeted investments in retail technology, and one high-priority area among retail leaders is data and analytics.

With the influx of data available due to rapid digital transformation, organizations are scrambling to adopt big data and real-time data analytics to better refine their business actions according to customers’ needs and profiles.

With the global big data market forecast to be worth $103 billion by 2027, data analytics is no longer just a buzzword but an important retail technology investment needed for day-to-day efficiency, for organizations and individuals alike.

Given the current talent gap, however, businesses will still look to third-party solutions to build an infrastructure that allows them to utilize data analytics effectively.

 

What Are They Looking For

 

Platforms that implement easy-to-use analytics, data mining, and automated forecasting. Department-specific data such as marketing, sales, and customer analytics will also be a key factor for many businesses.

 

Challenge 3: Digitalizing Stores and Scaling e-Commerce

 

The shift towards improved digital storefront experiences is in line with customer behavior: globally, 49% of the population is shopping online more than in pre-COVID times.

Nevertheless, customers still prefer shopping on location; a recent survey by Shekel found that 87% of customers prefer to shop in stores, provided they offer touchless or seamless self-checkout.

As such, improving the infrastructure for businesses’ e-commerce platforms and brick-and-mortar stores has become a race. Those who are able to achieve seamless online shopping experiences and frictionless smart payments will get the lion’s share of the market.

 

What Are They Looking For

 

The ability to transition from an analog business model to a digital, omnichannel model through cloud solutions, or by optimizing current digital channels such as mobile apps, IoT, and smart shopping.

 

Challenge 4: Improving Digital Security

 

Machine learning and cloud computing continue to be high priorities in tech adoption for retail leaders. Cybersecurity, however, has risen sharply in priority due to demands for safer and more secure digital and smart shopping.

The confusion caused by the coronavirus and the massive shift towards digital and remote working have made cyberattacks more frequent, with large data breaches increasing by 273% in the first quarter of 2020.

 
 

Retailers will face an uphill battle in the post-COVID "new normal" to put in place all the necessary digital security strategies, be it upgrading vulnerable software and hardware components or strengthening customer data protection, to ensure customer confidence and loyalty.

However, with the global market for cyber security software expected to reach $230 billion in 2021, retailers can expect rapid growth in the practices and solutions available for digital security.

 

What Are They Looking For

 

A simplified platform that allows them to reduce security risk through robust privileged access management (PAM) and optimal solutions for customer data storage and protection that comply with GDPR.

 

Overcoming The Challenges

 

At the start of 2021, it’s clear that retail giants are making big investments in innovative retail technologies. Certain areas, such as digital transformation, continue to be a major priority for retailers.

The big changes, however, come from renewed interest in improving customer journeys through data analytics and scaling up digital channels via e-commerce or smart shopping experiences.

For any organization, it’s essential to identify the areas of retail technology in which it is trailing behind, then connect with the right solution provider, invest in skilled talent, and acquire the tools needed to maintain growth in a soon-to-be revitalized industry.

CIO Investments: Which Tech Is Your Priority?

As the world crosses into 2021, the distribution of the COVID-19 vaccine has brought surges in global stocks and market optimism.

However, even with great hopes of economic recovery by the end of 2021, organizations still need to ensure that their business growth and plans continue positively. Chief Information Officers (CIOs) are playing a big part in achieving these goals by maximizing information technology (IT) investments and advancements.

 

What IT Investments To Focus On?

 

According to our Executive Trend Survey, 67% of CIOs placed data science as a top priority for 2021, with core focus areas in analytics strategy, data management, and big data analytics.

Meanwhile, cyber security and cloud were named as other top CIO priorities by 59% and 53% of surveyed leaders respectively.

 
 

But what does this mean for CIOs across the industries?

Based on feedback from CIOs and key IT executives, the largest group of them (47%) are facing 2021 with slight changes to their goals and a lower budget for their function.

 
 

With limited budgets, CIOs need to pick and choose which goal takes priority over the others and select a solution that will truly give them the return on investment they seek.

Thus, even if CIO trends point towards analytics, CIOs whose current objectives don’t call for data solutions should focus on more pressing investments.

Another key factor influencing their investment priorities lies in the current maturity levels of their technology and operations. For instance, some are still new in forming data strategies while others are more advanced in their data-driven processes, thus their focus areas in the use of data science differ greatly.

 

Investing In Data Science

 

Today, it’s uncommon to find a company that is not taking advantage of its data. From enhancing customer experience to improving predictive maintenance, business leaders are aware that data is critical to their organizational growth.

But which area of data analytics should your organization focus on? Between the different analytics applications and components, what should be the foremost priority?

In recent interviews with CIOs and other IT decision-makers, over 450 of them named analytics as their core focus. Even so, under the analytics umbrella, their interests ranged from big data analytics and predictive analytics to data warehousing and analytics strategy.

 
 

55% of them selected data management as their foremost investment in analytics, naming master data management (MDM) and product information management (PIM) implementation as some of their projects.

 
 

MDM solutions are largely adopted by the banking, financial services, and insurance (BFSI) sector to manage massive amounts of customer transaction data. PIM, on the other hand, is seeing higher demand from the e-commerce industry and is expected to grow quickly in the media and entertainment sector.

With regard to analytics strategy, some CIOs are investigating how analytics can make the business work more efficiently, while others are taking the next steps to improve data quality.

On the other hand, a number of the interviewed decision-makers are still setting up and realizing their data strategy, indicating that they’re still in the planning stages and concentrating on becoming a data-driven organization.

 

Investing in Cyber Security

 

Meanwhile, our most recent interviews with CIOs on cybersecurity investments found that cloud security is foremost on their priority list, followed closely by cyber security strategy.

 
 

From our findings, a number of the interviewed decision-makers expressed interest in implementing security information and event management (SIEM) solutions.

 
 

Another hot spot in 2021 cyber security spending, according to Forbes, is identity and access management (IAM), which is a prime focus for 30% of business leaders investing in cyber security. Some of their projects regarding access and identity management include:

 
 

With uncertainties still ahead, some CIOs are worried about guaranteeing a high level of cyber security on a limited budget while facing challenges in communicating online security to a diverse and remote workforce.

 

Investing in Cloud

 

Based on CIO investment feedback from the interviews, most of them are still in the planning stage of their cloud strategy with cloud integration and migration as their core priorities.

 
 

Microsoft Azure, Amazon Web Services, and Google Cloud are three of the most popular cloud platforms on the market. Interviewed decision-makers are weighing these cloud computing services against one another, while some are working with all three platforms.

Alternatively, a group of IT leaders and other key C-suite executives are working towards a hybrid cloud environment, which is commonly used in industries such as:

What is Your Focus Area?

 

As seen in our survey findings and interviews, each IT leader is prioritizing a specific solution that best serves their target goals, taking into account their budget, available expertise and IT talent, and current processes.

For some, the immediate focus is on surviving the consequences of the pandemic, “which has become the number one objective for most emerging technology investments”, according to KPMG’s research. For others, it’s an opportune time to shift to a more digital business model and accelerate their digital transformation.

Nevertheless, while benchmarking and taking note of emerging IT trends help your organization to measure business performance against other companies, the global situation and market uncertainty are still expected to significantly affect information technology investments.

The important thing is to maintain a solid focus on your strategic IT priorities, adopt agility and adaptability for business continuity, and make smart investments to prevail in the long term.

5G in Europe: Deployment & Practical Uses

During the first half of 2020, the pandemic drove an unparalleled, accelerated shift to mass home-working, living, and leisure. This has exposed how significant and critical our network infrastructures are. Some of these changes in behavior and working are likely to persist or re-emerge in the years to come.

 

Increased bandwidth demands, combined with the shifting geography of mobile usage and longer peak hours, highlight the need for sustained investment in telecom infrastructure and a regulatory framework that encourages it.

 

 

The European Round Table for Industry (ERT) recently released an assessment paper on the roll-out of 5G technology in Europe. The ERT concluded that Europe, and some of its largest individual nations, are well behind global competitors in deploying 5G technology.

 

However, the European Union is taking steps to redress the situation as it moves towards full 5G coverage in the near future. This is an achievable goal, given that two of the top three global companies in this space, Ericsson and Nokia, are European companies that continue to lead the global 5G rollout race.

 

First Steps Towards Practical Uses of 5G

Whether a business is a start-up, a corporation, or an LLC, 5G networks will reshape business as we know it.

Manufacturing optimizations

In Sweden, Atlas Copco Industrial Technique recently installed a private 5G network in its integration lab to bring operational efficiencies and cost savings to the manufacturing process.

 

This is one of the first industrial 5G implementations in the world, and it is being used to develop 5G-ready industrial tool solutions for customers worldwide.

 

2018 also saw telecommunications giants Nokia and Telia Company AB conduct what has been described as the first "real-world" industrial application of 5G in manufacturing.

Both companies leveraged the ultra-low latency, high-bandwidth capabilities of 5G to support time-critical applications, enhancing production and efficiency in a manufacturing environment.

 

 

As the demand for manufacturing optimizations increases, join in the discussion about smart and lean production advancements in modern industrial organizations at our 600Minutes virtual events, IndustryForum on Smart Machinery and Services (Finland), Smart Manufacturing, and the Executive Club Manufacturing in Sweden.

 

Retail

Amazon Go

Amazon Go is a recent example of a retail outlet with the potential to transition seamlessly to a staff-less experience. Amazon utilizes dozens of sensors to provide real-time inventory visibility and update pricing according to demand. Technology similar to that used in self-driving cars automatically detects when products are taken from or returned to the shelves and keeps track of them in a virtual cart. Once finished, shoppers can simply walk out of the store with their products; the system then charges their Amazon account and issues a receipt.
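
Purely as an illustration of the "virtual cart" idea described above, and not a description of Amazon's actual implementation, the event handling can be sketched as follows; the event names, catalog, and prices are made up:

```python
# Illustrative sketch of a "virtual cart" driven by shelf events.
# This is not Amazon's implementation; SKUs and prices are invented.

from collections import Counter

PRICES = {"sandwich": 4.50, "juice": 2.20}  # hypothetical catalog

class VirtualCart:
    def __init__(self, shopper_id):
        self.shopper_id = shopper_id
        self.items = Counter()

    def on_item_taken(self, sku):
        self.items[sku] += 1

    def on_item_returned(self, sku):
        if self.items[sku] > 0:
            self.items[sku] -= 1

    def on_exit(self):
        # Charge the shopper's account for whatever is still in the cart.
        total = sum(PRICES[sku] * qty for sku, qty in self.items.items())
        kept = {sku: qty for sku, qty in self.items.items() if qty > 0}
        return {"shopper": self.shopper_id, "total": round(total, 2), "items": kept}

cart = VirtualCart("shopper-42")
cart.on_item_taken("sandwich")
cart.on_item_taken("juice")
cart.on_item_returned("juice")
print(cart.on_exit())  # {'shopper': 'shopper-42', 'total': 4.5, 'items': {'sandwich': 1}}
```

The sensors' only job in this simplified picture is to emit "taken" and "returned" events; the cart itself is just a running tally that is settled when the shopper walks out.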

 

In essence, improved connectivity empowers retailers to monitor customer behavior more closely and make educated decisions to better engage shoppers, increase sales, and reduce operating costs.

 

Join our virtual events for the chance to exchange thoughts and ideas with fellow business leaders on the future of retail: The 6th Annual European Strategy Forum (The Netherlands), IndustryForum Retail (Germany), and the Executive Club Retail in Sweden.

 

Smart Homes and Cities

Steelcase partnered with Ericsson to create the next-generation office environment using 5G and IoT – the Steelcase WorkLife Center.

Together, they are testing new use cases for the workplace that will allow team members to collaborate more efficiently and effectively. Employees will be able to create smarter, connected solutions, for example ones that serve as an interconnected layer to support digital wayfinding, asset location, and room scheduling.

 

This web of connectivity will enable maintenance of the infrastructure and manufacturing systems, as well as robust flow control, adjustment, and fine-tuning of operating parameters to respond to real-time fluctuations in the environment and processes, as they occur.
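
As a rough illustration of what "fine-tuning operating parameters in response to real-time fluctuations" can mean in code, here is a minimal, hypothetical feedback loop; the setpoint, gain, and sensor readings are invented for the example:

```python
# Minimal, hypothetical feedback loop: adjust an operating parameter
# (e.g., ventilation fan speed) in response to live sensor readings.
# Setpoint, gain, and readings are illustrative only.

TARGET_TEMP_C = 22.0
GAIN = 0.8  # proportional gain

def adjust_fan_speed(current_speed_pct, measured_temp_c):
    """Return a new fan speed, nudged toward the temperature setpoint."""
    error = measured_temp_c - TARGET_TEMP_C
    new_speed = current_speed_pct + GAIN * error * 10  # simple proportional step
    return max(0.0, min(100.0, new_speed))  # clamp to the valid range

speed = 40.0
for reading in [23.1, 22.6, 22.1, 21.9]:  # simulated real-time readings
    speed = adjust_fan_speed(speed, reading)
    print(f"temp={reading}°C -> fan speed={speed:.1f}%")
```

In a real deployment the readings would arrive continuously over the connected infrastructure, and the same pattern would apply to other operating parameters.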

 

The demands of urbanization continue to grow in vast cities. Meet other industry leaders and join the discussion on the smart and sustainable development of our future cities at Smart Cities & Sustainable Societies and Smart Buildings & Facilities in Sweden.

 

Healthcare

During the 5G Healthcare Vodafone Conference & Experience Day in Milan, a remote surgical procedure was carried out for the first time in Italy over a 5G network, in collaboration with the Italian Institute of Technology (IIT) and the IRCCS Hospital San Raffaele. Professor Matteo Trimarchi performed the procedure from the Vodafone Village on a synthetic larynx model at the San Raffaele hospital, on the opposite side of the city.

 

The Government of Catalonia, SEM, Vodafone, i2CAT, IECISA, and 5G Barcelona are working together to develop advanced communication tools for 5G connected ambulances, highlighting how 5G networks affect critical areas such as healthcare.

 

Connected ambulances are already being used to receive specialized remote real-time HD video support while carrying a patient, and ambulances will soon also gain Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) capabilities to ensure they have access to clearer roads on the way to the hospital.

 

Explore the latest innovations and solutions that are aimed at improving health outcomes for patients and communities in the digital age at the IndustryForum Healthcare (The Netherlands) and the IndustryForum Hospital Healthcare (Germany).

 

 

Connected Transportation

The future of transportation is accelerating. New business models, limitless consumer experiences, and financial opportunities in the industry appear almost daily.

 

According to Intel, connected cars will save 250 million commuting hours and increase productivity gains by US$507 billion by 2030. The transportation industry also stands to gain over US$6 trillion from pilotless vehicles, autonomous business fleets, and ride-hailing services.

 

Ericsson and Telia in partnership with Einride are developing an Autonomous Electric Transportation (AET) solution for next-gen driverless vehicles. AET is an example of how 5G facilitates all-electric road freight transportation with the potential to reduce CO2 emissions by 90%, and eliminate harmful NOx emissions and ultrafine soot particles.

 

Conclusion

Businesses stand to benefit significantly from the increased speeds, reliability, and power provided by 5G infrastructure. New and existing technologies such as the Internet of Things (IoT), Smart Cities, Big Data, Autonomous Vehicles, Virtual Reality (VR), and Augmented Reality (AR) can reach new heights.

5G networks will undoubtedly enhance the speeds at which data is transferred from point to point, directly transforming how businesses work and operate to become more resilient and competitive. With increased productivity, companies will experience increased revenue and significantly boost the EU economy.