Ecosystem Archives - Enterprise Viewpoint https://enterpriseviewpoint.com/category/ecosystem/

Documenting Microsoft Environments Efficiently With Ease https://enterpriseviewpoint.com/documenting-microsoft-environments-efficiently-with-ease/ Tue, 01 Aug 2023 12:42:59 +0000

The post Documenting Microsoft Environments Efficiently With Ease appeared first on Enterprise Viewpoint.

Microsoft solutions like Microsoft 365, Active Directory, Azure AD and Microsoft Intune are popular worldwide. Around 70% of organizations rely on Microsoft Azure for cloud services, and over a million companies use Microsoft 365 to collaborate and be productive.

However, for an IT professional who manages Microsoft multitenant environments, maintaining a holistic view of the entire IT environment — knowing exactly what assets are on the network, who has them, where they are, how they are connected and what software is on them — can be challenging.

So, how do you view your entire IT environment in a single pane? Let’s find out.

What great documentation looks like

When your information is scattered across multiple fragmented solutions, it’s hard to find, track and manage everything. Your team wastes precious productive hours looking for information that may not even exist. Employees lose roughly 25% of their time looking for information when it’s stored in five or more fragmented systems.

Compare this with a situation where your technicians can find all the information they need about every asset, license, user and password in a single pane. The information is always up to date and accurate in your organization's single source of truth.

This is the dream documentation scenario, and it is certainly achievable.

Jumpstart documentation with intelligence-driven templates

While great documentation is critical to the success of any IT team, it isn’t always clear what great documentation looks like. That’s why you need intelligence-driven, pre-configured documentation templates to get going.

These templates “think for you” and are designed on best practices recommended by IT experts who have “been there, done that,” helping you eliminate time wasted on planning and administration.

Although these templates are tested and validated, they can still be customized to suit your business processes. They let you combine the collective intelligence of experts with your own expertise to meet your unique needs. They are also flexible, boost the consistency and efficiency of your operations, and help reduce errors.

Consolidate documentation with out-of-the-box integrations

An IT pro typically uses over 15 tools to manage their IT environments, including Microsoft 365, Active Directory, Azure and ServiceNow.

To get a holistic view of such a disparate system, you need an IT documentation tool that integrates with a wide variety of solutions and extracts and presents information in a useful way.

In other words, you need intelligence-driven integrations that convert unstructured data from multiple information sources into structured, consolidated and actionable information.

Connect Microsoft solutions with a documentation tool

Now that we have established the significance of bringing together all your Microsoft data in a centralized IT documentation tool, let’s find out how you can achieve it.

You need a documentation solution like IT Glue that integrates multiple solutions and brings the information together. Connecting multiple Microsoft solutions to a centralized IT documentation platform enables automatic, accurate and seamless information flow, supercharging your asset, license and user management. Since the flow is automated, the information is always up to date.

To understand “what” is in your network, you can connect a device management solution, like Microsoft Intune, that lets you track all your assets and device details, like device name, model, serial number and compliance status, from one place. For software assets, you can integrate with Microsoft 365 to pull licensing information and see how many licenses are active, consumed and unused.

To understand the “who” and “where,” utilize productivity and user management solutions like Microsoft 365, Active Directory and Azure. They let you extract details like AD status, last login, last password reset and password expiration.
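The consolidation idea can be sketched in a few lines of Python. This is a hypothetical illustration, not IT Glue's or Microsoft's actual API: the record shapes and field names are invented stand-ins for data pulled from Intune (the "what"), Microsoft 365 (licenses) and Active Directory (the "who" and "where").

```python
# Hypothetical records as they might arrive from three separate sources.
intune_devices = [
    {"serial": "SN-001", "model": "Surface Laptop 5", "compliant": True},
    {"serial": "SN-002", "model": "ThinkPad T14", "compliant": False},
]
m365_licenses = {"SN-001": "Microsoft 365 E3", "SN-002": "Microsoft 365 E5"}
ad_users = {
    "SN-001": {"user": "alice", "last_login": "2023-07-30"},
    "SN-002": {"user": "bob", "last_login": "2023-06-12"},
}

def consolidate(devices, licenses, users):
    """Join the 'what' (device), 'who'/'where' (user) and license data
    into one structured asset record per device — a single-pane view."""
    view = []
    for dev in devices:
        sn = dev["serial"]
        view.append({
            **dev,
            "license": licenses.get(sn, "unassigned"),
            **users.get(sn, {"user": None, "last_login": None}),
        })
    return view

assets = consolidate(intune_devices, m365_licenses, ad_users)
for a in assets:
    print(a["serial"], a["user"], a["license"],
          "compliant" if a["compliant"] else "NON-COMPLIANT")
```

The point of the sketch is the join itself: once each source is keyed on a common identifier, one structured record per asset replaces three separate lookups.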

After consolidating the “what,” “who” and “where” of your network, it’s time to connect these with the “how” and “why.” We’re talking about your processes and organizational knowledge. When these exist in someone’s head, they can easily leave your organization. That’s why it’s important to document them as well in a centralized solution.

Fix information sprawl with intelligent documentation

When you deploy multiple point solutions that are managed separately for each IT domain, application, network or infrastructure, you end up with a siloed IT infrastructure and unorganized documentation that offers no starting points to solve problems. It leaves technicians feeling confused and overwhelmed.

Intelligence-driven IT documentation lets you establish order in this chaos by extracting key information from various solutions, like Microsoft, and consolidating them in a structured format. It allows you to execute processes consistently and increase the efficiency of your operations.

Learn more about how you can eliminate information sprawl with IT Glue’s intelligence-driven IT Documentation Solution.

Nvidia’s Supply Chain Forecasting AI Journey With SAP IBP and Nvidia AI https://enterpriseviewpoint.com/nvidias-supply-chain-forecasting-ai-journey-with-sap-ibp-and-nvidia-ai/ Fri, 16 Jun 2023 15:56:13 +0000

The post Nvidia’s Supply Chain Forecasting AI Journey With SAP IBP and Nvidia AI appeared first on Enterprise Viewpoint.

Modern supply chain planning is a complex and dynamic process that involves the coordination and integration of multiple stakeholders, including suppliers, manufacturers, distributors, and customers, in order to deliver products and services to the end consumer. It encompasses all aspects of the product lifecycle, from sourcing and procurement to production, logistics, and delivery.

Just like the entire semiconductor and high-tech industries, Nvidia faced tremendous supply chain challenges in the past few years. We saw large volatility in demand. COVID-19 lockdowns brought a surge in demand for Nvidia products due to the shift to remote work and online learning. At the same time, travel restrictions and factory closures caused delays in production and shortages of critical parts. Trade disputes and tariffs, such as those between the US and China, led to increased costs and disruptions to the supply chain.

Tackling all these challenges simultaneously over the past few years required us to adopt new models, technologies and algorithms to help sustain our rapid revenue and profit growth. We believe we needed the following three elements in our solutions.

At the core is a Supply Chain Planning (SCP) system: an integrated demand and supply planning system that consolidates data from different sources to create a unified, holistic view. Complementing the SCP are predictive analytics models, leveraging traditional statistics and AI/ML, that can handle high fluctuations in demand and capture the complexity of multiple demand indicators. Finally, prescriptive optimization models leveraging operations research algorithms are needed to incorporate material and capacity constraints into the supply chain plans.

We began our journey by modernizing our forecasting system first. After an extensive evaluation of multiple tools, we selected SAP IBP. SAP IBP is a cloud-based solution offering flexibility, scalability and easy accessibility. IBP integrates seamlessly with other SAP solutions, such as SAP ERP, BW and APO, enabling a unified planning process. Our planners need options for the UI, and only IBP offers both a web-based UI and Excel add-ins.

It took us nine months to implement IBP Demand Planning for our consumer business units and another nine months for our enterprise business units. We now have 250+ users spread across the globe. We have moved 12+ separate Excel models for the different Nvidia business units into IBP, resulting in more consistent and improved forecast results. We estimate about 10K+ hours of productivity savings per year for our planners. Cycle time from sell-through forecast to revenue target setting has been reduced from 8 days to 4 days.

SAP IBP offers advanced analytics and scenario modeling capabilities that we use for our simpler use cases. But some of our use cases require more advanced AI/ML models such as sell through forecasting of our desktop and notebook business units.

Artificial intelligence (AI) has been seen as a game changer in supply chain planning, specifically in demand forecasting, inventory optimization, route optimization and risk management. In Nvidia's case, there are two main areas where AI can greatly help: predictive analytics and prescriptive optimization.

Traditional forecast algorithms lack the capacity and flexibility to consume large, diverse data and to understand the complex relationships among different factors needed to generate a more accurate forecast. They are also very rigid in adapting to changing trends or unexpected events, due to their inherent inability to incorporate external factors that could significantly enrich the forecasting model. But perhaps the most significant downside of traditional forecasting methods is that they are often time-consuming and labor-intensive, making them prone to human error. This problem is compounded when the planner is handling large volumes of data: gathering and analyzing it requires significant manual effort, which can introduce incorrect data entries or assumptions along the way.

Not only can AI-based forecasting handle more complex data, it is also highly adaptive, can process large amounts of data much faster and reduces the human factor. AI forecasting models can learn from past data and adapt to new information in real time, making them more accurate and flexible in their predictions. They can also adjust their models based on changing trends, which allows them to provide more accurate forecasts. Moreover, since AI is mostly based on deep learning models, which can leverage high-performance hardware like graphics processing units (GPUs), it can process large amounts of data in parallel. Finally, since most of the manual processes are fully automated, the risk of human error is reduced.
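The rigid-versus-adaptive contrast can be illustrated with something as simple as exponential smoothing, where each new observation updates the model. This is a toy sketch in plain Python, not the models Nvidia runs in Kratos; the demand numbers are invented.

```python
def exponential_smoothing(history, alpha=0.4):
    """Adaptive one-step-ahead forecast: each observation pulls the
    estimate toward recent demand, so trend shifts are absorbed quickly."""
    level = history[0]
    for actual in history[1:]:
        level = alpha * actual + (1 - alpha) * level
    return level

def static_mean(history):
    """A rigid baseline: the all-time average ignores recent shifts."""
    return sum(history) / len(history)

# Demand jumps in the last three periods (e.g., a lockdown-driven surge).
demand = [100, 102, 98, 101, 150, 160, 170]
print(round(static_mean(demand), 1))            # dragged down by old data
print(round(exponential_smoothing(demand), 1))  # tracks the recent surge
```

Even this tiny example shows why adaptivity matters: after a sudden demand shift, the static average still "believes" the old regime, while the smoothed estimate has already moved most of the way to the new level.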

Let’s walk through an example of how we use SAP IBP extensibility for Forecasting.

At Nvidia, our consensus forecast is the key data element that drives both our revenue forecast with Wall Street and our build plans. As you can see in the diagram below, the first step in our forecasting process is generating an unconstrained consensus forecast within SAP IBP, combined with external advanced AI/ML models in Nvidia Kratos that process historical data, various demand factors and planner judgment.

In the second step, we run mixed integer programming models to generate supply constraints against the unconstrained forecast. We run the model in a custom Gurobi solver we built internally. We are currently migrating the solution to our own Nvidia cuOpt optimization product, which leverages CUDA for parallel GPU processing at lightning speed.

In the third and last step, we generate our constrained forecast, which becomes the basis of our revenue forecast and build plans.
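At a very small scale, the constraining step can be pictured as allocating limited supply across the unconstrained forecasts. The real solution uses mixed integer programming in a Gurobi solver; the proportional allocation below is only a simplified stand-in for that idea, with invented SKUs and quantities.

```python
def constrain_forecast(unconstrained, supply):
    """Scale each product's unconstrained forecast down proportionally
    when total demand exceeds available supply. A real MIP would also
    honor per-product material and capacity constraints."""
    total = sum(unconstrained.values())
    if total <= supply:
        return dict(unconstrained)  # no constraint binds
    ratio = supply / total
    return {sku: qty * ratio for sku, qty in unconstrained.items()}

unconstrained = {"GPU-A": 600, "GPU-B": 300, "GPU-C": 100}  # 1000 units demanded
constrained = constrain_forecast(unconstrained, supply=800)
print(constrained)  # each forecast scaled by 0.8
```

An optimization model replaces the naive `ratio` with a solver that maximizes an objective (e.g., revenue) subject to the same supply limit, which is why the output can differ from a simple pro-rata split.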


Due to the interconnectedness of global trade coupled with the rise of e-commerce, a company's supply chain must adapt to the challenges of global disruptions (COVID-19, tariffs) and satisfy increased demand under supply constraints. As we have discussed, SAP IBP plus external models like Gurobi and Nvidia AI (Kratos) can meet the scale and complexity of these challenges.

How to prepare for the cloud-first future https://enterpriseviewpoint.com/how-to-prepare-for-the-cloud-first-future/ Fri, 26 May 2023 13:13:53 +0000

The post How to prepare for the cloud-first future appeared first on Enterprise Viewpoint.

Enterprises are repositioning themselves for the next wave of economic trends by prioritizing configurable, cloud-based software packages

By Aditya Kamalapurkar, Head of SAP Cloud Business and Sustainability at Capgemini Americas

In recent years, macroeconomic trends have accelerated transformation agendas across the software vendor market. Supply chain issues, increased energy costs, ESG regulations, and the race to adopt the latest and greatest digital capabilities have collectively reshaped the ways in which organizations operate. In turn, these trends have also altered organizations' expectations, turning them away from traditional monolithic ERP packages with multi-year contracts. Many enterprises are not only looking for customizable capability ecosystems, but they're also searching for partners with coherent messaging and collaborative approaches to understand how the right mix of specific capabilities can meet their unique business needs.

Hyperscalers and software vendors are well aware of the changing tides, and they are correcting course to meet their consumers’ needs. To do so, software vendors have accelerated their cloud-first strategies and cloud-based offerings—all while adjusting their go-to-market agenda. This is to prepare for a future in which clients will be able to cherry pick cloud offerings to create configurable, microservices-based, and API-led cloud software packages via a flexible contract structure and pricing model.

This may not seem like a new concept. We see it all the time in business. Economic trends force clients to alter their business strategies and their expectations from suppliers shift as a result—which then causes the supplier in question to meet the new market expectations with their own adjusted business models. However, we have not seen such a significant change in the software vendor market since the software heyday nearly twenty-five years ago.

Interestingly enough, vendors are on track to meet clients’ shifting expectations, but the adoption is varied. In the next ten years, I foresee the creation of ecosystems that will jointly and successfully advance offerings such as ready-to-run cloud-based ERP and business application platforms. This is key to constructing any offering strategies in such a way that these capabilities can seamlessly be plugged into tailored packages. And in the shorter term, vendors will provide companies with advanced business process transformation and sustainability offerings that can not only be utilized in the current capability ecosystem, but that can also be folded into the next generation of truly SaaS-based ERP packages. That’s great news, but again this promising momentum has the potential to leave organizations on the back foot if they do not have implementation strategies in place to make the most of next-generation solutions.

As the North American Lead for RISE with SAP at Capgemini, I’ve engaged in hundreds of client discussions with CxOs since its launch. Many leaders ask me about the long-term vision behind these cloud-first strategies as they not only struggle to select vendors or offerings, but they also are concerned about their organizations’ implementation plans for the new era of ERP offerings. Regardless of the client, I encourage every organization to spend time getting the fundamentals right.

Companies should always ask two questions: "What is the objective?" and "What happens once I go the RISE with SAP route?" Once organizations are able to understand how they measure key priorities, metrics, and requirements, they can then determine what offer and service packages will best suit them in the long term. This evaluation period also requires enterprises to watch the vendor market and study how their capability transformations are unfolding in order to then determine their own clean-core maintenance and microservices-based, API-led architecture plans. Based on these findings and decisions, clients will have the ability to map out entitlements from vendors' bundles and run pricing models for an apples-to-apples comparison.

After this process is completed, they'll need to get their ducks in a row so they're prepared to implement their ERP. If not planned well, this will be a challenging process. Leaders should recognize that this planning stage is not just about swapping out technologies; you're ultimately changing your business process and model, and therefore nailing down the business value proposition is key.

To tackle this daunting stage and ensure that organizations are prepared to make the most of vendors’ next-generation products and service models, consider the following tactics:

  • Improve business processes and agile tactics
  • Establish cyber, data, sustainability, and engineering strategies aligned to business goals and desired cloud-based applications
  • Focus on upskilling and reskilling talent
  • Identify technology bottlenecks, strategically off-boarding certain legacy solutions and on-boarding new capabilities that will pair well with your ERP package
  • Develop future assessment protocols to ensure your ERP package is effectively integrated after the implementation period

We are entering a new age in the software industry that will not only impact vendors’ business models, but also that of their clients. Cloud-first strategies will reign supreme and configurable ERP software packages will soon be available. However, with so much progress being made on the vendor side of the market, clients must also work to ensure their organizations are prepared to implement the offerings and adjusted partnership agreements that they’ve desired. Only then will they see true business harmony.

BIO: Aditya Kamalapurkar heads the SAP on Cloud Business for Capgemini Americas and globally leads the SAP Sustainability and Cybersecurity GTM. He co-leads the Women in SAP charter for North America. Adi has a background in Electrical and Electronics Engineering with a master’s degree in technology management from the TU Delft, Netherlands. He is based out of San Francisco.

Real-life Security Stories: How Companies Are Tackling Cyber Threats with Cloud ERP https://enterpriseviewpoint.com/real-life-security-stories-how-companies-are-tackling-cyber-threats-with-cloud-erp/ Fri, 26 May 2023 12:54:45 +0000

The post Real-life Security Stories: How Companies Are Tackling Cyber Threats with Cloud ERP appeared first on Enterprise Viewpoint.

In today’s fast-paced and interconnected business landscape, organizations face numerous security challenges. Among these concerns are the security risks associated with legacy systems.

Outdated, on-premises solutions often lack the robust security measures required to combat modern cyber threats. However, the emergence of Cloud Enterprise Resource Planning (ERP) systems has revolutionized how businesses address these security challenges, providing enhanced protection and peace of mind.

In this article, we will explore success stories of companies that recently transitioned to Cloud ERP, highlighting their achievements in scalability, strengthened infrastructure security, centralized access controls, enhanced security patching and updates, advanced threat detection, incident response and disaster recovery.

  1. Strengthened Physical and Infrastructure Security:

A rapidly growing manufacturing company that relied on an on-premises ERP system struggled to secure the budget necessary for implementing the latest-generation physical security measures, such as physical access controls and surveillance systems. These measures incurred significant costs and required dedicated resources.

Consequently, the company decided to migrate to Cloud ERP to leverage the provider's robust physical and infrastructure security. The cloud provider implemented state-of-the-art physical security measures, including biometric access controls and surveillance systems. They also have network security protocols such as firewalls and intrusion detection systems to protect against external threats. Furthermore, data is encrypted using industry-standard encryption algorithms, ensuring that even in the event of unauthorized access, the data remains secure and unreadable.

  2. Application Data Security and Access Control:

A multinational corporation with various departments and subsidiaries worldwide faced challenges due to fragmented data storage across various legacy applications.

This fragmented storage resulted in inconsistent data security measures and complex access controls, making it difficult to enforce uniform security protocols, maintain data integrity, and efficiently manage access authorizations.

By transitioning to Cloud ERP, the company centralized its data storage and implemented robust access controls, granting different levels of data access to employees based on their roles and responsibilities.

Additionally, they implemented a reveal-on-demand functionality where masked data could become accessible to the user on the fly if approved by the data owner, who would receive an immediate notification on their mobile phone.

This centralized and flexible approach streamlined data security management, ensuring that the right people have access to the right information, speeding up business processes while reducing the risk of unauthorized data breaches.
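Role-based masking with reveal-on-demand can be sketched simply. The example below is a generic illustration in Python, not the actual Cloud ERP feature: the field names and the approval flow are invented, and the owner notification is reduced to a function argument.

```python
SENSITIVE = {"salary", "ssn"}  # fields masked by default

def mask_record(record, approved_fields=frozenset()):
    """Mask sensitive fields unless the data owner has approved
    revealing them for this specific request (reveal-on-demand)."""
    masked = {}
    for field, value in record.items():
        if field in SENSITIVE and field not in approved_fields:
            masked[field] = "***"
        else:
            masked[field] = value
    return masked

employee = {"name": "Alice", "salary": 95000, "ssn": "123-45-6789"}
print(mask_record(employee))                              # default view: masked
print(mask_record(employee, approved_fields={"salary"}))  # owner approved salary
```

The design point is that the stored data is never copied into a less-protected form: the mask is applied at read time, so an approval only widens what a single request may see.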

  3. Enhanced Security Patching and Updates:

A large healthcare organization struggled to apply security patches and updates in a timely manner with their legacy systems. This process often required manual intervention and resulted in lengthy system downtime, leaving systems vulnerable to potential attacks and hindering efficient business operations.

Since adopting Cloud ERP, the organization now benefits from centralized security patching and updates handled by the cloud provider. The provider regularly releases patches to promptly address newly identified vulnerabilities, minimizing the window of opportunity for potential attacks without causing disruptions to the business operations.

  4. Advanced Threat Detection and Incident Response:

A popular e-commerce company had invested significantly in a Security Information and Event Management (SIEM) Technology to identify potential breaches at the infrastructure layer. However, they lacked advanced threat detection capabilities at the application level, where their sensitive data and business processes resided. This created concerns about the safety of their data and reputation.

Upon migrating to Cloud ERP, the company gained built-in security monitoring tools and machine learning algorithms which they needed at the application level. These tools correlate application user behaviours, network traffic, and system logs, promptly detecting anomalies and potential threats to sensitive business data and processes. In the event of a security incident, the Cloud ERP provider’s dedicated incident response team takes immediate action to mitigate the impact, minimizing potential damage to the company’s reputation and customer data.

  5. Scalability and Disaster Recovery:

A fast-growing technology start-up operated on an on-premises ERP system that struggled to accommodate their increasing demands for scalability. Scaling resources involved lengthy procurement and deployment processes, hindering their ability to respond quickly to business growth. Additionally, the organization lacked robust disaster recovery measures, making them vulnerable to extended downtime and data loss in the event of a system failure or disaster.

They have now transitioned to Cloud ERP, where they can quickly unlock new system capabilities as needed and explore new business opportunities. Furthermore, they now have robust disaster recovery strategies in place. Data is replicated across multiple geographical locations, and in the event of a natural disaster or system failure, they can quickly recover data and systems, minimizing downtime and ensuring uninterrupted business operations.

Conclusion:

These examples demonstrate the significant security challenges posed by legacy systems for organizations aiming to protect their critical data and systems.

The migration to Cloud ERP presents an opportunity to effectively address these challenges by leveraging a robust infrastructure with centralized data security, regular patches and updates, advanced threat detection, incident response capabilities, and reliable disaster recovery.

In all these success stories, the adoption of Cloud ERP empowered companies to fortify their security measures, supporting the availability, integrity and confidentiality of their business data and processes. This enabled them to concentrate on their core business objectives while unlocking innovation and scalability to propel their growth.

Enterprise Legacy system application in the context of SAP S/4HANA migration! https://enterpriseviewpoint.com/enterprise-legacy-system-application-in-the-context-of-sap-s-4hana-migration/ Thu, 18 May 2023 06:29:48 +0000

The post Enterprise Legacy system application in the context of SAP S/4HANA migration! appeared first on Enterprise Viewpoint.

We’re in the midst of the move to S/4HANA

The evolution of IT systems is an inevitable element of a corporate environment. It means that from time to time, some corporate applications need to be migrated to new ones – the same is the case with SAP ERP systems. For the past 8 years, SAP has been promoting the replacement of its flagship SAP ECC solution with SAP S/4HANA. In this case, the migration encompasses the implementation of different software (it is not just an upgrade), the adoption of a different database (if you were not already using HANA DB) and a different data model, along with a full migration project.

Different migration options

Migration projects today are more complex than they once were. You can rely on pre-configured solutions such as the SAP Migration Cockpit or Selective Data Transition (SDT) solutions. You may also go for a more technical approach called a Brownfield migration, which carries over as many custom components as possible from the source system and minimizes the initial re-engineering effort.

S/4HANA migration provides no traceability back to the original ERP system.

While these solutions are handy for migration, they all share a common pitfall: there is no migration traceability. That means it is not possible to prove that all data was migrated, that it remained unchanged (it is the original data) or that it was unfiltered (we have all the data). In other words, if you cannot prove that the data has remained complete and unchanged, the original legacy system must be kept, ensuring it meets all compliance requirements. All in all, this is fine – we don't need migration traceability; what we certainly want is an efficient new system.
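For context, proving that a dataset is complete and unchanged is commonly done with record-level fingerprints: hash each record before and after migration and compare the two sets. A minimal sketch, with an invented record format (real ERP migrations involve far more normalization than this):

```python
import hashlib

def record_fingerprint(record):
    """Stable hash of a record: keys are sorted so field order doesn't matter."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode()).hexdigest()

source = [{"doc": "INV-1", "amount": 100}, {"doc": "INV-2", "amount": 250}]
# Same records post-migration, except INV-2 was altered along the way.
migrated = [{"amount": 100, "doc": "INV-1"}, {"doc": "INV-2", "amount": 999}]

src_hashes = {record_fingerprint(r) for r in source}
mig_hashes = {record_fingerprint(r) for r in migrated}
print("complete and unchanged" if src_hashes == mig_hashes else "mismatch detected")
```

Comparing hash sets rather than the records themselves also lets the check run without moving the sensitive data a second time.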

It is imperative that your new system does not carry over historical data; the new system should support improved business processes that answer marketplace demands. The migration target is to build a great new system for managing future business processes in an improved way, providing better customer service and increased profits.

So, we could keep sunset (legacy) systems…

Now that we all agree the new system is not designed as a legacy archive, we must bear in mind that historical data is still a key source of information in many ways. Therefore, there is a strong need to retain access to legacy data –

  • to respond to audit or tax requests
  • in case previous data and documents need accessing.

From time to time, you must be able to produce data you can trust and, if required, show in court.

So sunsetting a legacy system (in read-only status) is a plausible option, but it doesn't come risk-free. Among the challenges:

  • Running legacy systems is costly.
  • They increase the “technical debt.”

Practically, keeping those sunset systems is highly unadvisable for the following reasons:

  • Common setups, like Virtual Machines, are far too vulnerable to increasingly frequent cyberattacks.
  • Since GDPR came into force in 2018, most countries and several US states have been pushing heavy financial and reputational penalties for breaking data privacy laws. Legacy systems usually do not enforce data privacy requirements; it is difficult to handle such obligations in these historic systems simply because they were not originally designed with that logic.

In the past, it may have been acceptable to pay for the maintenance of a legacy system's database, or even to bear the burden of maintaining a Virtual Machine. Today, however, keeping a legacy system alive requires a budget of a different order.

Securing a system from day-to-day vulnerabilities encompasses an array of obligations – taking care of servers, applications, storage, networks, operating systems, data, and run time for most classical legacy systems. It means acting, at each level, on security, updates, authorizations, help desks, management, training, and third-party contracts.

Or we manage to decommission legacy systems!

Hence, a new type of application is emerging: “legacy system applications”. Such applications allow organizations to take one step forward, moving from system sunsetting to system decommissioning.

Legacy applications can handle legacy reports, master and transactional data, generated or linked documents (such as invoices, delivery documents and emails), and archives (SAP systems generate loads of SAP data archive files, named ADK files) with full detail and traceability. It is important to note that system documentation must also be kept for audit purposes. The good news is that, as legacy systems become a growing concern, several options are now available in the market to deal with them.

Best practices to sunset systems in S/4HANA migration 

Firstly, when migrating to S/4HANA (or when migrating any application), decide the future of your sunset systems. The options are to –

  • Delete the system (I would not advise this for ERP systems).
  • Keep the system running, accepting that you will pay a high price.
  • Sit on the fence – postpone deleting the system while parking it in a temporary Virtual Machine.
  • Create a tax archive for legal purposes (extracting data, reports, documents, and documentation into a controlled environment).
  • Or, last but not least, implement an Enterprise Legacy System Application (ELSA).

Secondly, define a timeframe and a blueprint. I strongly advise dealing with the legacy systems as a separate project from the migration project.

Finally, if you decide to initiate a decommissioning project, you can run it in parallel with the migration project and reap the benefits of having access to legacy information. It is highly reassuring and contributes to a more innovative and efficient IT landscape.

The final word

The final piece of advice is to conduct a robust selection process when choosing the legacy system application solution, as this application will become the company-wide future hub for all the applications retired across the company.

Here are some essential features to look for in a legacy application and the corresponding IT supplier –

  • Ability to support tax and audit demands.
  • Capacity to quickly apply evolving data privacy regulations.
  • Ability to run the system safely and securely, staying current with the latest security updates to prevent vulnerabilities.
  • A user-friendly interface, ideally one that new employees can use without training to access historical information from old applications.

The post Enterprise Legacy system application in the context of SAP S/4HANA migration! appeared first on Enterprise Viewpoint.

Enabling ECC to S/4HANA Transformations Through Data Archiving https://enterpriseviewpoint.com/enabling-ecc-to-s-4hana-transformations-through-data-archiving/ Fri, 05 May 2023 06:21:56 +0000 https://enterpriseviewpoint.com/?p=14429 At a recent SAP event, PBS Software hosted a break-out session and asked “How many of you have moved or started to move your SAP systems from ECC to S/4HANA?”.  From 30 represented companies, 90% of those questioned have not migrated or started their journey.  It’s an astonishing number considering the time and effort to […]

The post Enabling ECC to S/4HANA Transformations Through Data Archiving appeared first on Enterprise Viewpoint.

At a recent SAP event, PBS Software hosted a break-out session and asked, “How many of you have moved or started to move your SAP systems from ECC to S/4HANA?”  Of the 30 companies represented, 90% of those questioned had not migrated or started their journey.  It’s an astonishing number considering the time and effort required to plan a complex system migration.

The most common concerns from SAP customers seem to center around the migration approach and how to manage their data.

“Should we use a greenfield approach and take this opportunity to standardize or re-design our processes?”

“Maybe brownfield is better, and we just convert what we have to an S/4HANA platform!?”

“We have multiple ERP systems in our environment.  Since we’re moving, should we consolidate?”

A number of factors impact those decisions, often including the availability of resources to support the migration, the impact on operations, user needs, and policies.  Supporting business operations, ensuring reporting integrity, and meeting compliance or data retention requirements are key concerns for the end state.

SAP data archiving and archiving solutions are often overlooked components of the transformation from ECC to S/4HANA.  While SAP includes standard functionality to archive data, many companies lack the expertise or experience with data archiving to understand how it can prepare them for the move.

Additionally, there are concerns about how archive data will be presented to users and how archiving actually helps.  Using standard SAP functionality and enhanced archiving solutions from PBS Software, SAP customers can prepare early, capture some easy wins in data reduction, and build confidence with their end users ahead of the move.

Here are some use cases where data archiving can help you prepare for the upcoming change.

  • Optimize your ECC data volume: Even if the journey to S/4HANA is expected to be years in the future, the ECC production database should be lean.  This will drive key decisions around the data migration and the future size of the S/4HANA environment that will hold your data.

While most companies may have adopted limited or sporadic data archiving practices to address specific problem areas, this leaves a lot of easy opportunities to reduce your production database footprint.

With industry-leading archive retrieval solutions (Archive add ons) developed by PBS Software, IT organizations can increase the scope of data archiving in their ECC environment and still enable seamless access to archive data using enhanced archive-enabled SAP transactions.  While standard data archiving focuses on migrating business-complete data from the production database to alternative storage, end users still have the flexibility to access archive data using familiar transactions.  This gives customers the flexibility to reduce the production data footprint using SAP standard archiving while ensuring that all audit and compliance needs are supported.

Some companies have been able to reduce their production data footprint by up to 50% with aggressive data archiving initiatives.
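As a rough illustration of how such a reduction might be scoped, the sketch below estimates the archivable share of a production database from per-table sizes and the fraction of records past their residence period. The tables named are common SAP examples, but every figure here is invented for illustration; a real assessment would come from database statistics and the archiving objects actually in scope.

```python
# Sketch: estimating archivable volume per table, given a residence
# period. All sizes and age profiles below are made-up illustrations.

tables = [
    # (table, total_gb, share_of_rows_older_than_residence_period)
    ("BSEG",  800, 0.50),   # accounting line items
    ("MSEG",  450, 0.45),   # material documents
    ("EDIDC", 120, 0.90),   # IDoc control records
]

total_gb = sum(size for _, size, _ in tables)
archivable_gb = sum(size * old_share for _, size, old_share in tables)
reduction = archivable_gb / total_gb

print(f"Total: {total_gb} GB, archivable: {archivable_gb:.1f} GB "
      f"({reduction:.0%} reduction)")
```

With these hypothetical numbers the archivable share lands right around the ~50% figure quoted above; the real answer depends entirely on the organization’s data age profile and residence rules.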

For the future move to S/4, the archive files (ADK files) are moved to storage that can be accessed from the new landscape.  The benefit of this method is that the archive data remains in low-cost storage (i.e. it is not imported into the S/4HANA database).  The data remains accessible in S/4HANA through compatible versions of the Archive add ons used in the ECC environment.

  • Move your content to the cloud early: Using a compliant content management solution like PBS ContentLink, businesses have the opportunity to move structured and unstructured archive data to their preferred cloud provider in advance of their S/4HANA migration.

With this solution, you have the option to update your ECC content management environment to a lean system that supports common cloud technology.  The solution supports storage of SAP archive files (ADKs), ArchiveLink documents, print lists, and more.

This gives the flexibility to migrate to a platform that supports the future S/4HANA environment, reduces the dependency for on-premise storage, and places archive data in a platform that’s ready to connect to the future S/4HANA system.  With minimal infrastructure required, this cloud-ready solution is ready for testing in days.  When it’s time to build the S/4HANA environment, it provides the flexibility to connect to the existing blob storage in the new environment without the need to move the data again.
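The general pattern of staging archive files in object storage can be sketched generically. To be clear, this is not how ContentLink itself works: the bucket name and key layout below are assumptions, and the (commented-out) upload call uses the AWS boto3 SDK merely as one familiar object-storage client.

```python
# Sketch: staging SAP archive (ADK) files for upload to object storage.
# The bucket and key layout are illustrative assumptions; the actual
# upload call is shown commented out so the sketch runs without
# cloud credentials.

import tempfile
from pathlib import Path

# Simulate a directory of ADK files as written by archiving runs.
archive_dir = Path(tempfile.mkdtemp())
for name in ["FI_DOCUMNT_001.adk", "MM_MATBEL_001.adk"]:
    (archive_dir / name).write_bytes(b"\x00" * 16)  # placeholder content

def plan_uploads(directory, bucket, prefix):
    """Map each local ADK file to an object key under a stable prefix,
    so the future S/4HANA system can find it at the same location."""
    return [
        (path, f"{prefix}/{path.name}")
        for path in sorted(directory.glob("*.adk"))
    ]

uploads = plan_uploads(archive_dir, bucket="my-sap-archive", prefix="ecc/adk")
for local, key in uploads:
    print(f"{local.name} -> s3://my-sap-archive/{key}")
    # boto3.client("s3").upload_file(str(local), "my-sap-archive", key)
```

The stable key prefix is the design point: because the future S/4HANA system connects to the same bucket and prefix, the files never need to move a second time.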

  • Stay compliant by maintaining legacy data

As discussed earlier, standard SAP data archiving gives customers the ability to move business-complete data to archive files, but what about the data that isn’t business complete?

Do you have an option to keep legacy data outside of the S/4HANA system?

Here’s a scenario to consider.  You’ve decided on a greenfield implementation.  The ECC environment has 10 years of data.  You were able to archive 7 years of business-complete data that needs to move to S/4, and there are 3 years of data that potentially can’t be archived.

The data retention policy requires that this data be retained for 15 years.  Or perhaps you have legacy systems, maintained for compliance, that you can’t simply abandon.
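The retention arithmetic in this scenario can be made explicit. The sketch below assumes 2023 as “today” and walks each data year through the two questions that matter: can it travel as archive files, and until when must it remain accessible?

```python
# Sketch: the scenario's retention arithmetic, year by year.
# "Today" is an assumed 2023; adjust the constants to your own case.

CURRENT_YEAR = 2023
RETENTION_YEARS = 15
OLDEST_YEAR = CURRENT_YEAR - 9        # 10 years of data: 2014-2023
ARCHIVABLE_THROUGH = OLDEST_YEAR + 6  # 7 business-complete years: 2014-2020

plan = {}
for year in range(OLDEST_YEAR, CURRENT_YEAR + 1):
    route = "ADK archive files" if year <= ARCHIVABLE_THROUGH else "legacy copy"
    keep_until = year + RETENTION_YEARS
    plan[year] = (route, keep_until)
    print(f"{year}: {route}, retain until {keep_until}")
```

Even the oldest archivable year must stay accessible until 2029, and the newest non-archivable year until 2038 — long after the ECC system itself is gone, which is exactly why the question of where that data lives cannot be dodged.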

With solutions like PBS Nearline Analytic Archive (NAA), you now have the option to make a full system copy of a legacy SAP system to a side-car database (outside of S/4HANA).  This copy can be completed regardless of the record status.

From the S/4HANA environment, adapted transactions will allow users to query the data from the side-car database.  This solution provides the flexibility to maintain compliance and decommission old infrastructure while keeping legacy data out of the S/4HANA environment.
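The side-car idea can be illustrated generically: legacy data lives in a separate, read-only store, and the new environment queries it on demand instead of importing it. The sketch below uses SQLite purely as a stand-in for the side-car database; the schema and query are invented and say nothing about PBS NAA’s actual interface.

```python
# Sketch: a "side-car" store queried on demand from the new system.
# SQLite is a stand-in; the schema and data are invented.

import sqlite3

# One-time step: a full copy of the legacy data, regardless of status.
sidecar = sqlite3.connect(":memory:")
sidecar.execute(
    "CREATE TABLE legacy_orders (order_no TEXT, year INT, status TEXT)"
)
sidecar.executemany(
    "INSERT INTO legacy_orders VALUES (?, ?, ?)",
    [("A100", 2019, "closed"), ("A101", 2021, "open"), ("A102", 2022, "open")],
)

def query_legacy(conn, year):
    """What an adapted transaction might do: read legacy rows on
    demand instead of holding them in the S/4HANA database."""
    return conn.execute(
        "SELECT order_no, status FROM legacy_orders WHERE year = ?", (year,)
    ).fetchall()

print(query_legacy(sidecar, 2021))  # [('A101', 'open')]
```

Note that the open 2021 order is queryable even though it was never business-complete — the property that standard archiving alone cannot provide.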

Finally, data archiving is not an afterthought in migration planning for S/4.  It’s a critical component that simplifies the movement of data, minimizes the impact of data migrations, ensures compliance, and reduces your total cost of ownership.

How important is your time? Choosing the right platform https://enterpriseviewpoint.com/how-important-is-your-time-choosing-the-right-platform/ Fri, 05 May 2023 06:15:50 +0000 https://enterpriseviewpoint.com/?p=14426 Years ago, one might have thought that with the introduction of cloud-based solutions the landscape or enterprise architecture of organizations would have simplified.  As it turns out, the opposite has happened.  There are more choices in today’s technology market than ever before.  The “rush” to the cloud has been more of a “walk” for many […]

The post How important is your time? Choosing the right platform appeared first on Enterprise Viewpoint.

Years ago, one might have thought that with the introduction of cloud-based solutions the landscape or enterprise architecture of organizations would have simplified.  As it turns out, the opposite has happened.  There are more choices in today’s technology market than ever before.  The “rush” to the cloud has been more of a “walk” for many large organizations that are starting to consider what should and shouldn’t be cloud hosted.  Contrast this with midsize high-growth organizations who have the advantage of being flexible and not set in their ways (regarding business processes).  They are more apt to adopt a cloud-first mentality, fit to the standard through in-application configuration over customization.  One thing is for certain, being “cloud focused” is different depending on aspects like the size of the organization, its industry, the line of business, the criticality of the solution in terms of contributing to competitiveness, and more.  Organizations are asking themselves whether they need a cloud provider to host software they would have traditionally hosted in their own data centers (a “bring-your-own-license”, or BYOL model on a “Platform-as-a-Service” or PaaS) or whether they should seek a solution that is fully managed by the provider (a “Software-as-a-Service” or SaaS).  Some organizations are choosing not to move certain solutions in their landscape to the cloud at all.  A somewhat surprising dynamic is that some organizations are moving some business solutions back to on-premise!

Cloud. On-premise.  Hybrid.  Those are terms that describe “where”.  There’s also a change to “how”.  Enterprise application providers are moving away from monolithic solutions toward modular ones intended to address finite business needs.  They are finding ways to replace historically synchronous solutions with asynchronous operations through an event-driven architecture, thereby creating continuity when an individual component fails rather than holistic business process failure.  Organizations must make difficult decisions about when to employ “best-of-breed” or “best-of-suite” solutions.  The decision is largely driven by the overall impact on the organization.  Does the “best-of-breed” solution accelerate the organization’s business processes, allow for rapid adjustment to new business models, and promote the greatest level of throughput?  How does it differentiate from the “best-of-suite” capabilities?

Satisfying business process requirements seems more complicated than ever.  “Off-the-shelf” solutions can address most of the needs – perhaps as much as 70-90%.  Unfortunately, satisfying business process requirements is not like a classroom grade where the range of “C-” to “A-” is sufficient.  The remaining, unsatisfied portion of the business process requirements must be addressed.  Historically, this was done with multiple business applications, bolt-on and core application customizations, and manual steps.  It employed broad concepts like integration, application development, data management, data strategy, and analytics.  Today we need to include intelligent technologies like machine learning, robotic process automation, generative artificial intelligence, and more.  The collection of all these capabilities might be grouped as “platform” services.  They help organizations meet the remaining aspects of their business process requirements that are not fully addressed by “off-the-shelf” solutions.  If organizations feel amazed by the choice of “off-the-shelf” solutions, they are likely astounded by the number of platform-related services in the marketplace.  There is a natural tendency to pick one standard cloud provider that hosts the largest volume of these services, in the belief that the organization can then develop core competencies in platform services that can be deployed across the entire organization, thereby addressing all the business process requirement gaps left behind by the “off-the-shelf” solutions.  This can work, but it sidelines the most important asset every organization – and each of us as individuals – has: time.

There are some interesting data points, statistics, and polls that suggest organizations in the US should be asking, at every turn, what is my time-to-value?  How long does it take to realize value from my investments?  There’s an interesting Gallup poll that suggests “Quiet quitters make up at least 50% of the U.S. workforce – probably more…”.  So, what’s a “quiet quitter”?  These are employees doing the bare minimum – uninspired or disengaged from their jobs.  One might ask what this does to organizational productivity.  It can’t be a good thing.

U.S. employers might think they could overcome this problem by weeding out the “quiet quitters” and replacing them with more energetic contributors.  The problem is that the U.S. unemployment-to-job-opening ratio is still somewhat upside down.  As of March 2023, there are 6 people unemployed for every 10 job openings.  This is not a good statistic if you are an employer: it means those who are unemployed have lots of options, so replacing “quiet quitters” with new employees from the pool of job seekers is not a quick solution.  This describes current conditions.  Is it likely to get better?  A little bit.  The labor force participation rate is projected to rise from 62.6% today to 63.2% by the beginning of 2024.  If the number of job openings remains static, this increase would result in roughly 7 people unemployed for every 10 job openings – still not a good statistic for the average employer.  In summary, these data points suggest that the U.S. marketplace has a problem with employee engagement, that employees have plenty of opportunities to seek new employment, and that conditions are not likely to improve soon.  Couple this with the fact that the workforce is most organizations’ largest cost, and that the employees in question – information technology specialists – rank among the highest-paid human resources in the U.S. marketplace.  Reducing time-to-value is therefore probably the biggest expense lever any organization has.
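For what it’s worth, the ratio arithmetic above can be checked with illustrative round numbers. The population, openings, and unemployment figures below are assumed magnitudes chosen to match the stated ratios, not official statistics.

```python
# Sketch: checking the unemployed-per-opening arithmetic with
# illustrative round numbers (assumed magnitudes, not official data).

population_m = 266.0   # working-age population, millions (assumed)
openings_m = 10.0      # job openings, millions, held static (assumed)
unemployed_m = 6.0     # ~6 unemployed per 10 openings today (assumed)

labor_force_now = population_m * 0.626    # 62.6% participation
labor_force_2024 = population_m * 0.632   # 63.2% participation
new_entrants_m = labor_force_2024 - labor_force_now

# If the new entrants start out unemployed and openings stay static:
unemployed_2024_m = unemployed_m + new_entrants_m
per_ten_openings = unemployed_2024_m / openings_m * 10

print(f"New entrants: {new_entrants_m:.1f}M")
print(f"Unemployed per 10 openings: {per_ten_openings:.1f}")
```

With these assumed magnitudes, a 0.6-point rise in participation adds roughly 1.6 million entrants and moves the ratio to about 7–8 unemployed per 10 openings, in line with the “roughly 7” figure above.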

Platform services should be complementary to the off-the-shelf solutions selected.  They should reduce the time-to-value.  Platform providers whose services most effectively complement their off-the-shelf solutions have an accelerating effect on addressing all business process requirements.  Native integration and prebuilt business content that maintains business context – offered and maintained by a “best-of-suite” provider – equate to speed of delivery, simplicity, and ease of maintenance.  Time is arguably the most valuable asset.  Are your business applications and platform choices making the most of yours?

Google and Microsoft Are Gaining on AWS https://enterpriseviewpoint.com/google-and-microsoft-are-gaining-on-aws/ Tue, 02 May 2023 05:32:33 +0000 https://enterpriseviewpoint.com/?p=14402 Last week, all three of the three major providers of cloud services gave quarterly updates. One major pattern stood out: (GOOG -0.47%) and (GOOGL -0.13%) of Alphabet Both Microsoft’s (MSFT -0.55%) and Google Cloud AWS from Amazon (AMZN -3.22%) is losing ground against Azure. In the quarter that ended on March 31, 2023, Alphabet reported […]

The post Google and Microsoft Are Gaining on AWS appeared first on Enterprise Viewpoint.

Last week, all three of the major cloud service providers gave quarterly updates. One major pattern stood out: Alphabet’s (GOOG, GOOGL) Google Cloud and Microsoft’s (MSFT) Azure are gaining ground on Amazon’s (AMZN) AWS.

In the quarter that ended on March 31, 2023, Alphabet reported a 28% year-over-year increase in Google Cloud revenue, and Microsoft reported a 27% increase in revenue from Azure and other cloud services. AWS’s net sales grew by less than 16%.

Amazon doesn’t seem to be worried about the threats from Google and Microsoft, but some investors appear to be. Amazon’s shares fell last week, while Microsoft stock soared and Alphabet stock edged a little higher.

Is there truly nothing to worry about? I don’t think there is any real reason for concern.  Sure, Google and Microsoft are catching up somewhat, thanks in large part to their AI initiatives. However, the bottom line is that Amazon, in CEO Andy Jassy’s words, “isn’t close to being done inventing in AWS.” My view has been, and continues to be, that Amazon, Alphabet, and Microsoft will be tremendously successful AI stocks over the long run.

Microsoft hits back at UK after Activision acquisition blocked https://enterpriseviewpoint.com/microsoft-hits-back-at-uk-after-activision-acquisition-blocked/ Fri, 28 Apr 2023 08:07:04 +0000 https://enterpriseviewpoint.com/?p=14384 The decision by the UK regulator to block Microsoft’s (MSFT.O) acquisition of “Call of Duty” creator Activision Blizzard “had shaken confidence,” according to Brad Smith, president of Microsoft. The purchase was halted on Wednesday by the Competition and Markets Authority (CMA), an independent agency of the government that said it may hurt competition in the […]

The post Microsoft hits back at UK after Activision acquisition blocked appeared first on Enterprise Viewpoint.

The decision by the UK regulator to block Microsoft’s (MSFT.O) acquisition of “Call of Duty” creator Activision Blizzard “had shaken confidence,” according to Brad Smith, president of Microsoft. The purchase was halted on Wednesday by the Competition and Markets Authority (CMA), an independent agency of the government that said it may hurt competition in the young cloud gaming business.

Microsoft hit back on Thursday, with Smith saying it was “probably the darkest day in our four decades in Britain” and that the decision sent the wrong message to the global tech industry about the UK. “If the government of the United Kingdom wants to bring in investment, if it wants to create jobs, it needs to look hard at the role of the CMA, the regulatory structure in the United Kingdom, this transaction, and the message that the United Kingdom has just said to the world,” he told BBC radio.

A spokesman for British Prime Minister Rishi Sunak said Smith’s comments were “not borne out by the facts”. “We continue to believe that the UK has an extremely attractive tech sector and a growing games market,” he said. “We will continue to engage proactively with Microsoft and other companies.”

Smith asserted that Microsoft had collaborated successfully with authorities in Brussels but not in London, disputing Britain’s claim that it would be more accommodating following Brexit. He said that once the corporation had responded to the CMA’s inquiries, it invited the regulator to follow up with any further questions. “They fell silent; we heard nothing from them,” he stated. The European Union, he continued, is a more desirable place to establish a firm than the United Kingdom if you want to eventually sell it.

However, CMA Chief Executive Sarah Cardell stated that the regulator’s responsibility was to ensure that Britain was a market where firms could expand and prosper.

AWS debuts generative AI https://enterpriseviewpoint.com/aws-debuts-generative-ai/ Mon, 17 Apr 2023 05:42:06 +0000 https://enterpriseviewpoint.com/?p=14308 Amazon Web Services announced an API platform named Bedrock, which hosts generative AI models built by top startups AI21 Labs, Anthropic, and Stability AI. Generative AI has exploded in popularity with the development of models capable of producing text and images. Commercial tools developed by buzzy startups like OpenAI and Midjourney have won tens of […]

The post AWS debuts generative AI appeared first on Enterprise Viewpoint.

Amazon Web Services announced an API platform named Bedrock, which hosts generative AI models built by top startups AI21 Labs, Anthropic, and Stability AI.

Generative AI has exploded in popularity with the development of models capable of producing text and images. Commercial tools developed by buzzy startups like OpenAI and Midjourney have won tens of millions of users, and Big Tech is now rushing to catch up.

While Microsoft and Google compete to bring generative AI chatbots to search and productivity suites, Amazon’s strategy is to remain fairly neutral – like some kind of machine-learning Switzerland – and provide access to the latest models on its cloud platform. It’s a win-win for the startups that have agreed to work with the e-commerce giant: developers pay to use APIs to access the upstarts’ models, and AWS provides and fully manages the underlying infrastructure behind those services.

“Customers have told us there are a few big things standing in their way today,” said Swami Sivasubramanian, AWS’ vice president of machine learning, in a blog post.

“First, they need a straightforward way to find and access high-performing [foundational models] that give outstanding results and are best suited for their purposes. Second, customers want integration into applications to be seamless, without having to manage huge clusters of infrastructure or incur high costs.”

Amazon Bedrock currently offers large language models capable of processing and generating text – AI21 Labs’ Jurassic-2 and Anthropic’s Claude – and Stability AI’s text-to-image model Stable Diffusion. Bedrock will also provide two of Amazon’s own foundation models under the Titan brand, not to be confused with Google’s Titan-branded stuff.

Developers can build their own generative AI-powered products and services on the back of these Bedrock-managed APIs, and can fine-tune a model for a particular task by providing their own labeled training examples. Amazon said this customization process would allow organizations to tailor neural networks to their particular applications without worrying that their private training data will leak, be misplaced, or be used to train other large language models.
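As a sketch of what building on such a managed API might look like: the request shape below is an assumption for illustration (each hosted model family defines its own JSON body), and the actual network call is shown commented out because it requires AWS credentials and access to the service.

```python
# Sketch: assembling a request for a hosted text model behind a
# managed API. Field names are illustrative assumptions; each model
# family defines its own body format.

import json

def build_request(prompt, max_tokens=200):
    """Assemble a JSON body for a hosted text-generation model."""
    return json.dumps({"prompt": prompt, "max_tokens": max_tokens})

body = build_request("Summarize our Q1 results in one sentence.")
print(body)

# A real invocation might then look like this (not run here):
# import boto3
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(modelId="<provider-model-id>", body=body)
```

The appeal of the managed-API model is visible even in this toy: the developer deals only with a prompt and a JSON body, while the provider owns the clusters and serving infrastructure behind the call.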
