Challenges of Migrating Legacy Applications to AWS


Introduction

In a world of rapidly changing technology, many organizations still rely on legacy mainframes to keep their most critical operations running. These age-old systems have been tuned and customized to meet the functional requirements of the business and, as a result, have locked organizations in to their vendors over the years. However, maintaining and supporting these systems can be a challenge: skilled resources are scarce, and the lack of an integrated testing environment can limit flexibility, add risk, and increase test time. In addition, legacy technology can encounter problems with maintenance, support, improvement, integration, and user experience.

So, what's the solution?

The answer lies in application, information, and data migration. By migrating to the cloud, organizations can improve their operational efficiency, reduce IT costs, improve performance, and take their business to the next level. Modern technology solutions can automate error-prone manual processes and provide enhanced reporting, rich-featured UIs, and rules engines, allowing businesses to manage data more efficiently, with changes reflected in real time.

The ultimate objective

The ultimate objective is to sunset the legacy system with minimal disruption to the business and transition towards a more robust and scalable information technology platform to support current and future business needs in a cost-effective and collaborative manner. This also involves designing a common technology platform for operational applications to minimize data redundancy and decrease the cost of building, integrating, and maintaining new and existing applications.

However, migrating from legacy mainframes to modern technology solutions is not without its challenges.
The key challenges during the migration process include:
  • Rewriting application architecture for the cloud
  • Complexity of the integration of data, systems, and processes
  • Compliance and security
  • Dealing with hybrid networking setups
  • Investing in people and tools needed to migrate successfully
  • Training users on the new systems
To overcome these challenges, businesses need to have a clear set of guiding principles in place.
Consider these solutions and guiding principles:
  • Create a reference architecture for migrating the legacy application to a cloud-native architecture on AWS.
  • Compliance, security, and hybrid connectivity – AWS accounts/VPCs, including Transit Gateway (TGW), Direct Connect Gateway, multi-region peering, Landing Zones, VPCs, AZs, subnets, security groups, and IAM roles
  • Data security – encryption at rest (AWS KMS) and encryption in transit (SSL/TLS)
  • Real-time transactions, streaming, and messaging integrations – SNS, SQS, MSK, Kinesis
  • Adapters – on-premises-to-cloud protocol bridges
  • Use serverless components/services as much as possible – Lambda, with Step Functions for workflows
  • Amazon API Gateway – Lambda functions are invoked through API Gateway
  • Computation – application containers in EKS
  • ALB – EKS pods are invoked through an ALB
  • AWS Secrets Manager – store credentials securely
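As a concrete illustration of the data-security principle above, the sketch below builds an S3 bucket policy that denies any request not made over TLS, using the `aws:SecureTransport` condition key. This is an illustrative sketch, not the project's actual policy, and the bucket name is hypothetical.

```python
import json

def tls_only_bucket_policy(bucket_name: str) -> str:
    """Return a bucket policy (JSON) that denies all non-TLS access.

    aws:SecureTransport evaluates to "false" for plain-HTTP requests,
    so this Deny statement enforces encryption in transit.
    """
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyInsecureTransport",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            }
        ],
    }
    return json.dumps(policy, indent=2)

# Hypothetical bucket name for illustration
print(tls_only_bucket_policy("legacy-migration-data"))
```

Encryption at rest would be handled separately, e.g. by enabling default bucket encryption with an AWS KMS key.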

In conclusion, the migration from legacy mainframes to modern technology solutions, such as AWS, is no longer an option but a necessity for businesses that want to remain competitive and agile. While the migration process may seem daunting, it can be successfully achieved with careful planning and execution, along with adherence to guiding principles. By leveraging AWS’s cloud-native architecture and services, organizations can improve operational efficiency, reduce costs, and enhance their overall competitiveness. With the right strategy and tools, the migration journey can result in a more robust and scalable information technology platform that meets current and future business needs.

Optimising aircraft turnaround time – a TCG Digital service offering


Introduction

Optimising aircraft turnaround time is a critical task for airlines looking to maximise efficiency and minimise costs. Delays in turnaround time can lead to lost revenue opportunities as well as increased costs associated with aircraft operations. According to industry estimates, efficiency gains of up to 15% can be achieved over current turnaround processes and technologies.

One of the most significant contributors to turnaround delays is refuelling, accounting for a whopping 56% of all such delays. The typical cost of turnaround operations for a B737 is $70/hour. The good news is that a 25% uplift in refuelling efficiency can reduce turnaround time (TAT) by up to 3 minutes, which can translate into significant cost savings for airlines. For airlines with a fleet size of around 500 aircraft, reducing cycle time by 4-6 minutes can free up 2-3% of the fleet, potentially leading to cost savings of between $30-75M through TAT optimisation.
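As a rough sanity check on these figures, the sketch below computes the direct operational savings from shaving minutes off each turnaround. The number of turns per aircraft per day is an assumed input; note that the larger $30-75M estimate comes from freeing up aircraft capacity, not from hourly operating cost alone.

```python
def annual_fleet_savings(fleet_size, turns_per_day, minutes_saved_per_turn,
                         ops_cost_per_hour):
    """Direct ops-cost savings per year from shorter turnarounds."""
    hours_saved_per_day = fleet_size * turns_per_day * minutes_saved_per_turn / 60
    return hours_saved_per_day * ops_cost_per_hour * 365

# 500 aircraft, an assumed 6 turns/day each, 3 minutes saved per turn,
# $70/hour turnaround ops cost for a B737
savings = annual_fleet_savings(500, 6, 3, 70)
print(f"${savings:,.0f} per year")  # prints "$3,832,500 per year"
```

Even on direct operating cost alone, the minutes add up to millions per year for a large fleet; the capacity effect of freeing 2-3% of the fleet is where the larger savings come from.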

But how can airlines achieve these efficiencies? Our breakthrough solution for TAT optimisation offers significant benefits to both airlines and airports. By reducing the cost of operations and minimising ground time, it enables better aircraft utilisation and gives airlines opportunities to operate new routes, ultimately increasing revenue for airlines and airports alike. The solution utilises real-time feeds from airport cameras at gates, analysing the video in real time with advanced AI/ML algorithms on a scalable cloud platform. The system analyses movable and immovable objects on the tarmac, such as luggage carts, trolleys, cargo, fuel trucks, tugs, catering trucks, and cleaning staff and equipment, to determine turn events that could delay TAT. It also generates pre-configured alerts and notifications to enlisted subscribers, and provides a true omnichannel customer experience via state-of-the-art dashboards.
The solution landscape

The backbone of TCG Digital’s solution is built on AWS infrastructure. Video feeds from gate cameras at airports are captured using AWS IoT Core and published onto Kinesis Video Streams. The Orchestrator, running on ECS Fargate, consumes the videos and uses a pre-trained inference model running on an EC2 instance to generate turnaround events, then publishes those events onto a Kinesis Data Stream. A Lambda function consumes these events and mutates them onto an AppSync API to be displayed on the turnaround dashboard. A rules engine built using Step Functions analyses the events and raises alerts in case of any potential delays.
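To make the Lambda step concrete, here is a minimal sketch of how such a function might decode a batch of turnaround events from the Kinesis Data Stream and assemble the corresponding AppSync GraphQL mutations. The mutation, field, and event names are hypothetical, not TCG Digital's actual schema, and the real function would sign and POST each mutation to the AppSync endpoint.

```python
import base64
import json

# Hypothetical GraphQL mutation for a turnaround event
CREATE_TURN_EVENT = """
mutation CreateTurnEvent($input: TurnEventInput!) {
  createTurnEvent(input: $input) { gate eventType timestamp }
}
"""

def handler(event, context=None):
    """Lambda entry point for a Kinesis Data Stream batch."""
    mutations = []
    for record in event["Records"]:
        # Kinesis delivers each record payload base64-encoded
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        mutations.append({
            "query": CREATE_TURN_EVENT,
            "variables": {"input": payload},
        })
    # In the real function each mutation would be POSTed to the AppSync
    # endpoint with SigV4 or API-key auth; here we just return the batch.
    return mutations
```

Because dashboards subscribe to the AppSync API, each successful mutation is pushed to every open dashboard in real time.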

In conclusion, TCG Digital’s TAT optimisation solution is a game-changer for airlines looking to improve efficiency, reduce costs, and enhance the passenger experience. By reducing turnaround time, airlines can increase revenue opportunities, operate more efficiently, and provide a more seamless travel experience for their passengers.

The solution backbone

Improving transported animal welfare for a North American airline


Introduction

Pets are an integral part of many families, and for pet owners, their furry friends are like family members. When it comes to air travel, it’s essential to ensure that pets are well taken care of and that their safety is a top priority. Ensuring animal welfare posed a significant challenge for a large North American carrier seeking real-time visibility into animal scanning compliance, from a pet’s acceptance through to customer delivery.

That’s where TCG Digital’s Animal Wellness Initiative came in, providing a comprehensive solution that improved the efficiency and effectiveness of the carrier’s cargo operations.

The TCG Digital team’s solution was a game-changer, providing the carrier with real-time visibility of pets from the time they were accepted to customer delivery. The Animal Wellness Initiative included an intelligent dashboard with automated rules for accountability when handing off pets, historical reporting for scanning compliance, and tracking of revenue impact and handling costs. With rule-based solutions to aid visual checks within and outside scheduled alerts, the carrier could maintain an audit trail of activities and operationalize scanning devices to handle pets during irregular operations.
The solution was cloud-enabled and included an omnichannel system for alerts and on-demand reporting. By implementing TCG Digital’s solution, the carrier was able to minimize animal transportation incidents and associated handling expenses while improving customer satisfaction, making customers more likely to choose the airline for future travel.

To implement the Animal Wellness Initiative, TCG Digital utilized the tcg mcube Reporting Accelerator Framework; AWS Lambda, AppSync, Kinesis, DynamoDB, Elasticsearch, Step Functions, CloudWatch, and EKS; Docker; microservices; Java; NodeJS; Ionic; Amplify; Android; iOS; and Angular 7. The consulting engagement allowed for a UX/UI design that detailed functionality and technical architecture/design, ensuring a seamless and efficient implementation process.
Overall, TCG Digital’s Animal Wellness Initiative was a success, providing a comprehensive solution that improved the carrier’s cargo operations, reduced costs, and most importantly, ensured the safety and welfare of pets during air travel.

Cargo Dashboard for airlines – An implementation approach


Introduction

A large US-based airline was struggling to keep track of their cargo items on a daily basis. The lack of real-time tracking had led to missed critical time thresholds, causing revenue loss for the airline and damaging their brand value. The ramp agents were forced to operate on a "what you see is what you get" basis, relying only on information from the departure staging guide or the items present at the gates. The airline was in dire need of a solution that could provide real-time status updates for their cargo items.

To address this complex issue, TCG Digital followed a business-first, design thinking-based approach. The team conducted extensive field studies across 7 hub airports in the US, spending considerable time with ramp personnel and supervisors in the cargo department. They then proceeded with ideation, prototyping, and testing for a month to design a customer experience and lay new business processes. The team prioritized and customized requirements, rapidly built and deployed the final product with the most relevant features, and continually fine-tuned the solution through a feedback-based system.

The result? The Cargo Dashboard, a software product that revolutionized the cargo department’s operations with the following features:
  • Scanner apps to scan cargo items and maintain scan history
  • Configurable scan points to let the business decide touch points across which they want cargo tracking
  • Real-time monitoring
  • Real-time alerts
  • Rule-based engine
  • Visual tracking
To ensure the product’s success, TCG Digital designed a robust system with the following architectural goals:
  • Use responsive design for unified desktop and mobile experience
  • Create, expose, or consume microservice-based APIs
  • Create real-time alerts in the application for user notifications

The architecture followed a loosely coupled approach, reducing single points of failure. Responsive visual design was used, and data was protected both at rest and in transit using modern encryption methods, such as HTTPS for data in transit and KMS for data at rest. AWS cloud-specific best practices were followed, including high availability across multiple Availability Zones and caching of resources where possible to reduce the load on computational resources.

Cargo Dashboard Architecture

Reference architecture using AppSync and EKS for creating a real time dashboard
  1. A corporate user of the dashboard logs in to the application. Cognito User Pool is used to present a login screen to the user where the user enters login credentials.
  2. Cognito connects with the corporate identity provider to authenticate users. The federation is achieved using a pre-configured trust relationship based on SAML assertions. On successful authentication, Cognito uses a callback Route 53 URL to redirect users to the dashboard landing page.
  3. Route 53 routes the request to an ingress load balancer deployed within a private subnet. The load balancer sits on top of a set of microservices deployed onto an EKS cluster. The front end is also containerized and deployed on EKS. S3 and CloudFront were considered as an alternative for the front end; however, CloudFront was not a whitelisted service at the organization and couldn’t be used.
  4. To establish a secure connection between the AWS cloud and the corporate datacenter, Direct Connect is used. An IPsec VPN tunnel is also configured, and BGP routing is used to route traffic between the cloud and the datacenter.
  5. The microservices connect to corporate cargo services to fetch data for the dashboard. The corporate services essentially act as the backend for the microservices.
  6. The corporate cargo services generate some alerts to be displayed on the dashboard. These alert messages are pushed to a managed Kafka cluster on the cloud.
  7. A Lambda function is triggered that reads alert messages from Kafka and mutates those onto an AppSync API. AppSync uses DynamoDB as the backing datastore.
  8. The dashboard application subscribes to the AppSync API. After each mutation, the dashboard receives the new alert message. If a message is deleted from a dashboard the deletion gets mutated and is reflected in all open dashboards across devices. AppSync also takes care of offline synchronization.
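To illustrate the rule-based engine behind the alerts in steps 6-8, here is a minimal sketch of one kind of rule it might apply: flag a cargo item when too much time has passed since its last scan at a given scan point, which is exactly the "missed critical time threshold" problem the dashboard set out to solve. Scan-point names and thresholds are illustrative, not the airline's actual configuration.

```python
from datetime import datetime, timedelta

# Hypothetical scan points and maximum dwell times before an alert fires
THRESHOLDS = {
    "ACCEPTED": timedelta(hours=2),
    "STAGED": timedelta(minutes=45),
    "AT_GATE": timedelta(minutes=20),
}

def overdue_items(scans, now):
    """Flag items whose latest scan is older than the threshold.

    scans: list of (item_id, scan_point, scanned_at), latest scan per item.
    Returns a list of alert dicts for overdue items.
    """
    alerts = []
    for item_id, scan_point, scanned_at in scans:
        limit = THRESHOLDS.get(scan_point)
        if limit is not None and now - scanned_at > limit:
            alerts.append({"item": item_id, "point": scan_point,
                           "overdue_by": now - scanned_at - limit})
    return alerts
```

In the deployed system such alerts would be pushed to Kafka and mutated onto the AppSync API, appearing on every open dashboard in real time.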
The benefits of the Cargo Dashboard were enormous! The ramp agents were now able to make quick and easy decisions for cargo items based on real-time data, which increased their efficiency appreciably. The number of touch points and scan locations, along with alerts, increased overall responsibility and visibility for the agents, leading to clear target-oriented performance. Delays and disruptions in cargo handling dipped substantially, leading to cost savings and an uplift in overall brand value for the airline.

In conclusion, TCG Digital’s business-first, design thinking-based approach enabled them to build a future-proof solution that met the client’s needs. The Cargo dashboard proved to be highly beneficial for the airline as it led to significant improvements in the efficiency and effectiveness of their cargo operations, enabling them to reach new heights.

Cloud Migration & Modernization


Introduction

Running your business in the cloud has many benefits, such as greater agility, the ability to go global quickly, and significant cost savings. With so much on offer, it is apparent why leading organizations are looking to leverage the benefits of cloud computing. Not only are firms migrating applications and datacenters to the cloud; they are also taking advantage of leading capabilities offered by cloud providers. Business and IT users benefit in myriad ways, such as fast virtual desktops, advanced AI and machine learning analytics, automated data backup, and rapid disaster recovery.

However, some organizations are hesitant to begin a cloud adoption journey due to perceived challenges and roadblocks. These first steps are made easier if a cloud migration follows well-established strategies. Cloud migration and modernization should be viewed as a continuous process that requires change management spanning people, process, and technology. Taking a comprehensive approach will not only help you successfully navigate the journey but also ensure that you realize the intended benefits of greater agility, the ability to scale, and operational efficiency.

Unlocking Value from Industrial Data


Introduction

Recently my team and I were discussing how to help our clients unlock the value of their industrial and manufacturing data. These industries often wrestle with terabytes of time-series data from a myriad of sensors, machines, processes and external sources. Each data source could potentially include different features with various formats, have non-rationalized time intervals and be generated from various proprietary technologies. The challenge of making this data available on a platform where workers can exploit the data and discover its hidden value can be overwhelming. Here is where the vision of AI Democratization often hits its first significant roadblock.

Managers who are tempted to feel elated at the prospect of having access to all the data they could ever wish for, soon realize it is a data sword of Damocles* hanging by a thread, ready to snap and bury them if they can’t find a way to unlock its value.
Richard Westall’s Sword of Damocles, 1812
* The parable of the sword of Damocles teaches us that no matter how good someone’s life may appear, it’s difficult to be happy living under existential threat.

The Modern Historian

The Data Historian manages data from cradle to grave (assuming your data ever dies!)
This brings me to one of my favorite tech journalists, Rob O’Regan. He tackles some of these issues in an article written for CIO in May 2021. Specifically, Rob makes the case for the modernization of data/operational historians. These platforms must modernize: they must shift from collecting, dumping, and dashboarding data to report on what has happened (and perhaps what is happening now) into cloud-ready, end-to-end platforms that can use AI/ML to predict what will happen next. These platforms must be available to many personas in the organization and provide data in a way that presents a low-impedance environment for people to get what they need, quickly.
Features of a modernized historian
  • Cloud-based
  • Scales up/down to any workload requirements
  • Real-time data ingest
  • Integrates and standardizes data from any industrial data source
  • Feature stores
  • Versioning
  • Low-code/no-code programming
  • Advanced analytics, AI/ML
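To make the "integrate and standardize" feature concrete, here is a minimal sketch of one common normalization step: resampling an irregular sensor series onto a fixed time grid by carrying the last observation forward, so sources with non-rationalized intervals can be joined on time. This is an illustration of the technique, not a description of any particular historian product.

```python
from bisect import bisect_right

def resample_last(samples, start, stop, step):
    """Resample an irregular series onto a fixed grid (last value forward).

    samples: list of (timestamp, value), sorted by timestamp.
    Returns a list of (grid_timestamp, value); value is None before the
    first observation.
    """
    times = [t for t, _ in samples]
    grid = []
    t = start
    while t <= stop:
        # Index of the last sample at or before grid time t
        i = bisect_right(times, t)
        grid.append((t, samples[i - 1][1] if i > 0 else None))
        t += step
    return grid
```

Once every source is on the same grid, features from different sensors and proprietary systems can be aligned row by row and fed into downstream AI/ML pipelines.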

MLOps No Longer an Option

Having said all of the above, little is accomplished if these advanced AI/ML platforms are not managed with MLOps best practices. The entire end-to-end data pipeline, from raw data intake to actionable insights, must itself be industrialized. MLOps ensures your ML pipeline can create, train, evaluate, validate, deploy and monitor the quality of ML models and update them when performance starts to degrade.
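As a minimal sketch of the "monitor and update when performance degrades" part of that loop, the class below tracks a rolling window of per-batch accuracy and signals retraining when it falls below a fraction of the accuracy recorded at deployment time. The window size and tolerance are illustrative assumptions, not prescribed values.

```python
from collections import deque

class ModelMonitor:
    """Flag a deployed model for retraining when accuracy degrades."""

    def __init__(self, baseline_accuracy, window=50, tolerance=0.9):
        self.baseline = baseline_accuracy      # accuracy at deployment
        self.scores = deque(maxlen=window)     # rolling window of batches
        self.tolerance = tolerance             # allowed fraction of baseline

    def record(self, batch_accuracy):
        self.scores.append(batch_accuracy)

    def needs_retraining(self):
        if not self.scores:
            return False
        rolling = sum(self.scores) / len(self.scores)
        return rolling < self.baseline * self.tolerance
```

In a full MLOps pipeline this signal would trigger the automated retrain/evaluate/validate/deploy cycle rather than a manual intervention.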

 

Data scientists require a laboratory for experiments which can then be automatically deployed into your ML pipeline for full lifecycle management. Everybody knows that Henry Ford didn’t make automobiles – he made the assembly line, which made automobiles. Your ML pipeline is the product, which if done well, will generate endless business value.

MLOps is the industrialization of data science
Benefits of MLOps Best Practices

Optimize Operations and Reduce Cost
  • Automation eliminates manual practices
  • Continuous Integration/Delivery/Training
  • Reduce time-to-value
  • Eliminate technical debt
  • Onboard new employees quickly

Increase Quality and Model Effectiveness
  • Capture knowledge and expertise
  • Version everything for repeatability
  • Transparency for regulatory/ethical compliance
  • Continuously monitor model performance
  • Manage failures in production

Final Thoughts

Everyone knows that there is value in data and that leveraging that value means a more successful business. The problem is very few organizations understand the way to derive value from data. It all starts with the data, skilled people and an analytic platform. These elements together make for an AI Factory. It is best to find a partner who has both the skilled consultants to show the way and a fully functional platform to provide the data historian and MLOps people with the technology they need to succeed. MLOps is no longer an option on this journey. Of course I would say that. I work for just such a company.

Check us out at www.tcgdigital.com.

About tcg mcube

tcg mcube is an advanced analytics and AI Platform, which allows users to create compelling business solutions for tackling complex industry problems. With its modular architecture, tcg mcube handles many data sources, and it provides an efficient “cut-and-fit” into legacy environments if needed. Its mantra: Velocity to Value.

 

  • Ingest structured and unstructured data from diverse sources
  • Store ingested data within big data stores and data lakes
  • Provide a library of algorithms for ML and AI
  • Create stunning visualizations using a powerful BI library
  • Empower the modern Data Historian
  • Enable MLOps to optimize data value

The Colonial Pipeline Cyberattack: Explained

The recent cyberattack on the Colonial Pipeline Company in the United States has clearly demonstrated the significance of cybersecurity: the slightest lapse in digitally securing an organization can prove to be its Achilles heel. This digital attack shows the real-world consequences that impact physical assets, livelihoods, and the economy. In this blog, we shall explore some cybersecurity measures for combating cyberattacks like the one on Colonial Pipeline.

 

The Giant Company

The Colonial Pipeline Company, headquartered in Georgia, operates the largest pipeline system for refined oil products in the U.S. This network of pipelines extends between New York and Texas, is 5500 miles long, and provides almost 45% of the East Coast’s fuel.

 

What had happened? The Incident that Brought the Company to a Grinding Halt

In May 2021, Colonial Pipeline came under a cyberattack and all IT operations had to be stopped immediately. The incident was a ransomware attack carried out by a cybercriminal hacking group called DarkSide. The malware they launched compromised the technology and equipment that managed the pipelines.

 

Who was affected? The Ramifications

During this period, the entire fuel supply service from Texas to New York was paralyzed. This invariably led to a shortage of available gas, which in turn drove up gas prices as well.

 

How exactly did it happen? The Way of the Cybercriminals

The cybercriminals gained access to Colonial Pipeline through an inactive Virtual Private Network (VPN) account. Though no longer in use, the account was still accepted by Colonial’s servers and remained connected to the network. The account’s credentials had been found on the dark web, and the attackers also carried out double extortion.

This meant that, in addition to encrypting data with ransomware, the cyberattackers had also stolen nearly 100 GB of critical data. There is sufficient evidence that the hackers moved laterally around the IT network.

While the FBI was investigating, DarkSide, the cybercriminal group, demanded 75 BTC (about $4.4 million) to release the data it had encrypted and exfiltrated using the ransomware.

After the payment, the U.S. Department of Justice (DOJ) stated that around 64 BTC had been recovered from the virtual wallet the cybercriminals used to collect the payment from the victim.

Nowadays, hackers use a new tactic called double extortion: they not only encrypt the data but also threaten to leak or sell it.

 

The Consequence – DarkSide lost access to Infrastructure

The virtual wallet was seized and the FBI, along with other US government agencies, recovered the funds; as a result, DarkSide lost access to its infrastructure. The group put out a statement in an online forum that “services were ceased (country not named); money of advertisers and founders was transferred to an unknown account”. The hacker group also claimed that it had released decryption tools to all the companies it had attempted to extort but had yet to receive payments from.

 

Mitigations – How to Shield against Cyberattacks?
The number of ransomware-extortion cyberattacks on America’s energy infrastructure, oil and gas, and power sectors has increased tremendously. Such attacks are adversely affecting oil and gas production along with other supply chains across the globe.

Some measures to prevent cyberattacks:
  • Backups are important: Use backup systems and create multiple copies. Backups also need to be tested for infected files.
  • Disable macros: Microsoft Office files transferred within emails may contain malicious macro scripts; disable macros unless they are explicitly needed.
  • Endpoint protection and antivirus: Performing regular system scans and updating antivirus signatures are critical. EDR tools should be configured and updated per the latest rules and policies.
  • System patches: Devices, applications, and cloud management systems should be patched and updated regularly. If possible, a centralized patch management system should be used.
  • Internet access restrictions: Key entry points for ransomware include social network websites and personal email. Limiting and restricting access to these can be beneficial.
  • Monitor third parties: Continuous monitoring of the network and of all third-party activity is needed.
  • Restricted policies: Ransomware commonly targets directories such as Temp; execution from such directories, as well as in-memory execution, should be blocked.

 

Responding to Ransomware – How to Treat Compromised Systems?
  • Isolate the infected system from the network to contain the malware and prevent it from spreading.
  • Find out whether a decryptor is available. There are plenty of online resources, such as https://www.nomoreransom.org/
  • Restore files from backups taken earlier


Thus, it can be noticed that while precaution is better than cure; in certain critical instances like cyberattacks, precaution is the only cure!

The Nuances of Ransomware Protection Unveiled

What is Ransomware?
Ransomware is a kind of malware that pressures a victim into paying a ransom: unless and until the ransom fee is paid, the victim’s data remains encrypted or blocked from access. In the context of such an attack, an entire organization can be left paralyzed.
How does it spread?
Ransomware spreads through phishing emails that contain malicious attachments, or through software that a user downloads unknowingly. The attachment can be in almost any format, such as ZIP files, Word documents, or Excel spreadsheets. A ransomware attack can also take place when a user visits an unfamiliar or compromised website.
WannaCry: a ransomware
WannaCry ransomware (also known as Wana Decrypt0r, WCry, or WannaCrypt) came to notice in 2017, when a massive attack across multiple countries took place on 12 May 2017. Multiple reports state that over 300,000 systems were affected in more than 150 countries. The attack hit multiple industries, including healthcare, government, telecommunications, and petrochemicals. Notably, this malware is classified as a worm, meaning it can replicate itself to spread across an entire network. Researchers also noted that WannaCry has an encryption component based on public-key cryptography.
How does WannaCry infect systems?
The malware uses the infamous EternalBlue and DoublePulsar exploits developed by the NSA (National Security Agency) and leaked in 2017 by the Shadow Brokers group. EternalBlue exploits an SMB v1 vulnerability, which Microsoft patched on 14 March 2017 and documented in security bulletin MS17-010. The vulnerability allows an attacker to achieve remote code execution by sending crafted messages to TCP ports 139 and 445 of an unpatched Windows system. Once the EternalBlue exploit is executed, WannaCry attempts to install the DoublePulsar backdoor on the system.
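As a rough illustration of the exposure described above, a quick check for open SMB-related ports on a host you are authorized to assess can be sketched as follows (Python; the target host below is a documentation placeholder, and a real assessment would use proper scanning tools):

```python
import socket

def check_open_ports(host, ports, timeout=1.0):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports

# TCP ports abused by EternalBlue-style SMB attacks
smb_ports = [139, 445]
```

For example, `check_open_ports("192.0.2.10", smb_ports)` would report which of the two SMB ports are reachable on that (placeholder) host; any hit on an unpatched system is a red flag.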
Prevention
  • Don’t click on suspicious links: If a link within an email looks irrelevant or suspicious, it’s best to avoid clicking on it. Sometimes simply interacting with the link is enough to start the malware download.
  • Information related to you: If you receive a call or message from an unknown contact, do not respond. The attacker may be attempting a phishing attack, and sharing personal information would make you vulnerable.
  • USB drives: If an unknown person (untrusted source) hands you a USB drive, never plug the device into your system. It may contain auto-run scripts that execute as soon as the drive is connected.
  • Updating your Operating System: Keep your programs and operating system updated, as vendors provide regular patches that often include critical security updates. If you follow this practice, attackers will find it harder to exploit your system or network.
  • Using VPN: As daily users, we connect to public Wi-Fi networks in coffee shops, subways, railway stations, and restaurants without realizing that doing so makes us vulnerable to cyber-attacks. It is therefore good practice to use a VPN in public places and, unless absolutely essential, to avoid public networks altogether.
  • Downloading Software: Download software only from known, trusted sources; files from unknown sources may contain ransomware.
  • Application whitelisting: Configure operating systems or use third-party software to allow only authorized applications to run on computers, thus preventing ransomware from executing.
  • Awareness: Organizations benefit from educating their employees on ransomware protection practices and on countermeasures for an attack.
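The application-whitelisting idea can be sketched in miniature: compare a file’s hash against a list of approved hashes before allowing it to run (Python; the allowlist is hypothetical, and production whitelisting relies on OS facilities such as AppLocker rather than a script):

```python
import hashlib

def file_sha256(path):
    """Compute the SHA-256 digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_whitelisted(path, approved_hashes):
    """Allow execution only if the file's hash is on the approved list."""
    return file_sha256(path) in approved_hashes
```

Hash-based allowlisting means that even a renamed or disguised binary is blocked unless its exact contents were approved.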
If remote services are not required, it is advisable to turn them off; this makes the system more secure and prevents attackers from executing remote exploits. For organizations, plenty of endpoint protection solutions are available that protect against ransomware. Apply content filtering on your mail servers, block known malicious websites, and configure firewall rules that protect against ransomware.
Backup
Additionally, Windows includes a built-in data backup feature, and it is good practice for users to keep backups of their data. Note that NAS (Network Attached Storage) configured within a network can also be a target of ransomware. This is one reason why you should create regular backups of your data and keep multiple copies.
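The advice above on keeping multiple backup copies can be sketched as a small script (Python; the directory paths in a real deployment would be your own):

```python
import os
import shutil

def backup_to_destinations(source_dir, destinations):
    """Copy the source directory tree to each backup destination."""
    created = []
    for dest in destinations:
        target = os.path.join(dest, os.path.basename(source_dir.rstrip(os.sep)))
        shutil.copytree(source_dir, target, dirs_exist_ok=True)
        created.append(target)
    return created
```

Keeping at least one destination offline or off-network matters: ransomware that reaches a mounted NAS share can encrypt the backups along with the originals.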
Conclusion
To avoid becoming a victim of ransomware, we urge you to adopt the necessary precautionary measures and block future attacks. If you do fall victim to ransomware, take action by following the response steps outlined above.

How Emerging Technologies Are Reshaping the Future of Retail

While the impact of COVID-19 is being felt all over the world, retail is among the hardest-hit sectors. It is fraught with low supply and demand, dysfunctional supply chains, and an increasing reliance on customers. As retailers continue to struggle, a few critical trends have emerged, ranging from changed product mixes to complex consumer-retailer interactions defined by safety and a growing demand for convenience. This has led businesses to quickly adopt new technologies as they navigate uncharted territory and try to stay relevant and profitable.

AI is revolutionizing retail
It would not be wrong to state that now is the time for brands to think digital. In their effort to meet changing customer requirements, technologies like artificial intelligence (AI) and machine learning (ML) will help them stay afloat. Compared to traditional analytics, these technologies bring an entirely new level of data-processing capability, which leads to more valuable insights. At present, 28% of retailers already deploy AI/ML solutions, a seven-fold increase from 2016 (Source: SPD Group). By adopting AI, retailers can glean valuable insights from customer intelligence and behavioral data and make informed decisions. By capitalizing on AI, and specifically on ML, retailers are providing immersive experiences to captivate customers, synchronizing offline and online channels for seamless service delivery, and redefining traditional supply chains as flexible systems that respond accurately to shifting consumer mindsets.

Demand forecasting
Amidst all these changes, retailers are faced with one critical question, which is, how will the pandemic impact demand and how can they accurately predict the shift? Many retailers are turning to ML-powered demand forecasting to adapt to today’s reality. In contrast to traditional forecasting methods, this new approach is more adaptable to changes and can be implemented faster. ML enables systems to learn automatically and improve recommendations with data alone, without relying on human intervention for additional programming. As retailers generate large amounts of data, ML technology can often deliver significant business value. When data is fed into an ML system, it searches for patterns and uses them for better decision-making. In most cases, ML makes it possible to incorporate multiple factors and correlations which impact demand into the retail forecasts. And, it is possible to enhance the accuracy of demand forecasting by optimizing the systems with POS data, NLP models, and recent data from external sources.
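As a toy illustration of the pattern-driven forecasting described above (far simpler than a production ML system), a weighted moving average gives recent sales more influence over the next period’s prediction (Python; the sales figures and weights are made up):

```python
def weighted_moving_average(sales, weights):
    """Forecast the next period as a weighted average of the most recent periods.

    `weights` are ordered oldest to newest and should sum to 1.
    """
    recent = sales[-len(weights):]
    return sum(s * w for s, w in zip(recent, weights))

# Hypothetical weekly unit sales; newer weeks weigh more heavily
weekly_sales = [120, 130, 125, 140, 150]
forecast = weighted_moving_average(weekly_sales, weights=[0.2, 0.3, 0.5])  # roughly 142 units
```

An ML-based forecaster generalizes this idea: instead of fixed weights, it learns from the data how strongly each factor (season, price, promotions, external events) should influence the prediction.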

It has been seen that NLP models are often used to analyze social media and news, which in turn helps determine customer sentiment. Text mining and sentiment analysis enable retailers to closely monitor the comments which customers share on social media. This gives them an idea of what customers buy most often, their feedback on the product availability, and their changing preferences. With adequate samples of conversations which customers engage in, NLP models can predict goods which are running out of stock and need to be replenished. They can also detect slight changes in the purchase patterns, thereby improving the accuracy of demand forecasting.
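A production sentiment model would use trained NLP, but the underlying idea can be sketched with a naive keyword count (Python; the word lists are illustrative only):

```python
# Illustrative keyword lists; a real system would use a trained NLP model
POSITIVE = {"love", "great", "fast", "recommend"}
NEGATIVE = {"broken", "late", "terrible", "refund"}

def sentiment_score(comment):
    """Return +1 (positive), -1 (negative), or 0 (neutral) for a comment."""
    words = {w.strip(".,!?") for w in comment.lower().split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return (score > 0) - (score < 0)  # sign of the net keyword count
```

Aggregating such scores over thousands of social media comments is what lets retailers spot, for instance, a surge of negative sentiment about a product running out of stock.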

Pricing Optimization
Besides demand forecasting, pricing is another area where ML proves useful. With the dramatic shift to online channels, pricing strategy and competitive pricing have significant implications for retailers. Those who set prices based on conventional metrics alone may fall behind. Machine learning algorithms can decipher unconventional relationships between multiple parameters, providing valuable insights that help set the optimal price. Specifically, ML helps retailers determine price elasticity, i.e., the impact of a price change on a product’s demand. This capability plays a vital role in promotion forecasting and in optimizing markdown prices, especially when retailers need to clear out stock. However, price elasticity alone might not capture the complete impact of a price change; the prices of alternative products within the same category often have a significant impact as well. Here, advanced ML algorithms can be deployed to determine a product’s price position in a straightforward manner.
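The price-elasticity concept above has a simple arithmetic core: elasticity is the percentage change in quantity demanded divided by the percentage change in price (the figures below are made up for illustration):

```python
def price_elasticity(old_price, new_price, old_qty, new_qty):
    """Elasticity = % change in quantity demanded / % change in price."""
    pct_qty = (new_qty - old_qty) / old_qty
    pct_price = (new_price - old_price) / old_price
    return pct_qty / pct_price

# Hypothetical: a 10% price cut lifts demand by 15%, so elasticity is about -1.5
e = price_elasticity(old_price=100, new_price=90, old_qty=200, new_qty=230)
```

A magnitude above 1 (as here) marks an elastic product, where markdowns meaningfully move volume; ML models extend this by estimating elasticity jointly across related products in a category.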

Supply chain management
Apart from forecasting demand and determining the best price for a product or service, managing the supply chain effectively is equally crucial. Supply chains have always been vulnerable to natural disasters, disruptions, and geopolitical issues, but the upheaval caused by the COVID-19 pandemic has been one of a kind. It has led businesses to take a close look at their supply chain management strategies, both for sustainability and for growth. Predicting future demand for production remains one critical challenge in supply chain management. Here, machine learning takes into account multiple factors that cannot be tracked through the existing methods retailers rely on. Businesses use ML to conduct in-depth analyses of individual customers and predict their future buying behavior, which helps tailor production and transport processes to actual demand, enabling them to deliver value and build a relationship of trust with their customers.
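The per-customer prediction idea above can be illustrated with a minimal sketch: estimate each customer’s average gap between purchases and project the next expected order (Python; the purchase dates are fabricated):

```python
from datetime import date, timedelta

def predict_next_purchase(purchase_dates):
    """Project the next purchase from the mean gap between past purchases."""
    ordered = sorted(purchase_dates)
    gaps = [(b - a).days for a, b in zip(ordered, ordered[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return ordered[-1] + timedelta(days=round(mean_gap))

# Hypothetical customer who orders roughly every two weeks
history = [date(2021, 1, 1), date(2021, 1, 15), date(2021, 1, 29)]
next_order = predict_next_purchase(history)  # gaps of 14 days -> 2021-02-12
```

Aggregating such per-customer projections across the customer base is one simple way production and transport schedules can be aligned with expected demand.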

On a concluding note
AI, and specifically machine learning, will impact almost every aspect of retail operating models. While traditional retailers have hesitated to wade into this technology, they cannot afford to wait any longer. Their forward-thinking counterparts, who have already embarked on an AI-enabled future, are poised to gain an edge and leapfrog the competition. While there may not be a one-size-fits-all solution, a test-and-learn attitude will help retailers realize the full potential of emerging technologies.

Recalibrating Customer Experience in the New Normal

Dale Carnegie rightly put it when he said, “When dealing with people, remember you are not dealing with creatures of logic, but creatures of emotion.” These words of the famous American motivational speaker and writer have become even more significant now as businesses try and adapt to changing customer behavior with care and empathy. Within a very brief span of time, COVID-19 has gravely affected the lives and livelihoods of millions across the globe. And, this has led organizations to rethink the definition of customer engagement and service, prompting them to closely monitor customer satisfaction metrics to be able to address their needs accurately.


Understanding customer needs

According to a report by Oracle, 86% of buyers[1] are willing to pay more for a great customer experience. But one daunting question most businesses face is how to approach customer experience (CX) differently in the new normal. Before they can create an engaging customer experience, the first step is to understand what matters most to customers, especially during this disruptive scenario. A Salesforce report suggests that 73% of customers[2] expect organizations to understand their needs and expectations. One of the best ways to gauge their requirements is to understand them intimately and know their persona well. Often, defining the target audience and delving into their demographic profiles and digital footprints can provide insights into their needs and wants.


Role of AI

Businesses are also fast embracing AI technologies such as machine learning to understand the customers accurately. The technologies are being used to gather and analyze historical, social, and behavioral data and gain a better understanding of the customers. AI which continuously improves from the data which it analyzes can accurately anticipate customer behavior. It delivers actionable and real-time customer insights which equip brands to create relevant and thoughtful content across touch points that not only resonate with customers but also improve the chances of sales opportunities and enhance the customer journey.


Need for differentiation 

While understanding the customer is important, it is equally crucial to differentiate from the competition. If you want to stand out as a brand, you must personalize the customer experience. Close to 80% of customers[3] are more likely to make a purchase when businesses provide personalized experiences. AI helps brands connect with customers at a more personal level by analyzing customer sentiment and feedback with a precision that is not achievable by humans alone. It has also been seen that feedback-driven models foster higher engagement, loyalty, and retention.


Why strategize? 

However, before looking into AI, what brands need is a CX vision and a robust strategy. While laying down a well-thought-out strategy, brands need to factor in the role of safety and trust. The pandemic has brought forth consumers who are more thoughtful in their decision making and more selective with purchases. They are engaging with brands that demonstrate safety, convenience, and trust in experience delivery. Most consumers trust a brand that is good at protecting the privacy of their personal information, delivers quality products and services at fair prices, and engages with them in a meaningful way. This is leading forward-thinking brands towards process and business-model innovation at every point of interaction with customers.


Focus on efficiency

Along with this, well-performing brands are prioritizing efficiency. They have realized that to achieve a superior customer experience that is also sustainable, they need to focus on efficiency and convenience. Since the onset of the pandemic, customers are no longer willing to tolerate inconvenience despite their loyalty to a trusted company. Companies that opt for the efficiency path to win profitable customers have a few factors in common. Most of these companies have access to the data they need and streamline workflows to align with customer journeys, which helps overcome organizational silos and eliminate habitual repetitive tasks. Moreover, employees working with these brands are empowered to connect the dots between specific customer expectations and interactions. They are prompt in adopting technology to automate routine tasks and deliver consistent experiences across touch-points. Finally, these brands remain focused on supply chain innovation and optimization for seamless service delivery.


Adopting omni-channel strategies

In the wake of the pandemic, there is an increased need for omni-channel strategies. And one among them is to boost customer loyalty through several touch-points. By providing customers with multiple opportunities to connect, brands are driving positive customer experiences and loyalty. As consumer buying habits have changed, with a majority opting for online purchases and ordering in bulk, organizations need to foresee spikes in demand for which they are relying on technology. By investing in technology, they can expect improved inventory management and order routing along with gaining an in-depth knowledge of consumer behavior and preferences.

Omni-channel transformation is the need of the hour, where touch-points are not treated in isolation but as part of seamless transitions as customers move from one channel to another. Customer experience journeys cannot be treated as linear; they involve frequent shifts between traditional and digital channels, which vary according to customer types and preferences. Understanding customers’ digital behavior goes a long way in reducing the churn rate. Often the best approach is to offer automated tools and self-service, touch-less technology for basic interactions with tech-savvy consumers, and to appoint highly skilled live agents to handle critical requests.


Final note

From the above discussion we can conclude that the power to differentiate, understand customer needs at a granular level, maintain a focus on efficiency, and adopt omni-channel strategies are the key factors that can drive an enhanced CX. This investment in building relationships and delivering a superior experience can help brands retain valuable customers in the long run. Over and above these, it is critical to understand and prepare for major changes, as evolving consumer habits often necessitate a change in CX strategy, not just for sustainability but also for a competitive advantage, the importance of which cannot be overlooked in these uncertain times.