Top 7 Challenges of Big Data and Solutions for 2023

Big data can be a revolutionary tool for businesses across all industries, but like all tools, its effectiveness depends on how well it is used—and big data has proven particularly difficult for many organizations to wield. To remain competitive in an increasingly data-centric landscape, businesses must learn how to capitalize on big data’s potential. This article looks at the challenges of big data and explores why so many big data projects fall short of expectations. It also presents the seven most common obstacles faced by enterprises and offers a roadmap to overcome them and make the most of big data.

What Is Big Data?

Big data is more than just information in large quantities—more specifically, it’s data too large and complex to manage or process with conventional methods. Processing even a fraction of the millions of terabytes of data generated daily takes considerable computing power and storage capacity. It also takes data quality, data management, and data analytics expertise to maintain all that data and unlock its potential.

Even a minor amount of data can be helpful to businesses that know how to use it to learn more about customer behavior, product performance, and market trends, for example—but small volumes of data also provide limited reliability. Just as a larger sample size ensures scientific experiments are more representative of the real world, big data provides a better look into actual events and trends.

The Big Data “3 V’s”

The “big” in big data covers three primary categories, known as the Three V’s—volume, velocity, and variety:

  • Volume. This is the most straightforward of the three, as big data naturally involves huge amounts of data. The sheer scale of information in these datasets renders conventional storage and management systems effectively useless.
  • Velocity. Big data is also big in its velocity, or how fast new information is gathered and processed. Processing must be rapid to keep up with the pace of information.
  • Variety. Information in these datasets comes in multiple formats from numerous sources—industrial devices, social media channels, and emails, for example—and can include text, sales data, videos, pictures, or sensor information, to name just a few. This rich variety provides a more complete picture of what the business wants to understand.

These three dimensions provide a useful way to think about big data and the challenges of working with it. It involves unthinkably huge amounts of data coming in like a firehose at blistering speeds in too many shapes and sizes to easily manage.

Challenges of Big Data

This volume, velocity, and variety of data can push businesses further than ever before, but the majority of big data projects fail. Here are seven of the most common reasons why, and solutions to help overcome these obstacles.

1. Cybersecurity and Privacy

Security is one of the most significant risks of big data. Cybercriminals are more likely to target businesses that store sensitive information, and each data breach can cost time, money, and reputation. Similarly, privacy laws like the European Union’s General Data Protection Regulation (GDPR) make collecting vast amounts of data while upholding user privacy standards difficult.

Visibility is the first step to both security and privacy. You must know what you collect, where you store it, and how you use it in order to know how to protect it and comply with privacy laws. Businesses must create a data map and perform regular audits to inform security and privacy changes and ensure that records are up to date.

Automation can help. Artificial intelligence (AI) tools can continuously monitor datasets and their connections to detect and contain suspicious activity before alerting security professionals. Similarly, AI and robotic process automation can automate compliance by comparing data practices to applicable regulations and highlighting areas for improvement.
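
To make the idea concrete, here is a minimal sketch of automated anomaly monitoring using scikit-learn's IsolationForest. The synthetic access-log features and the one percent contamination rate are illustrative assumptions, not a recipe for production monitoring.

```python
# A minimal anomaly-monitoring sketch. The feature columns (bytes read,
# requests per minute, share of off-hours activity) are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=[50, 10, 0.1], scale=[10, 3, 0.05], size=(500, 3))
spikes = rng.normal(loc=[500, 80, 0.9], scale=[50, 10, 0.05], size=(5, 3))
access_log = np.vstack([normal, spikes])

# Fit an unsupervised model and flag outliers for human review.
model = IsolationForest(contamination=0.01, random_state=0).fit(access_log)
flags = model.predict(access_log)  # -1 marks likely anomalies
print(f"{(flags == -1).sum()} records flagged for security review")
```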

2. Data Quality

Data quality—the accuracy, relevance, and completeness of the data—is another common pain point. Human decision-making and machine learning require ample and reliable data, but larger datasets are more likely to contain inaccuracies, incomplete records, errors, and duplicates. Not correcting quality issues leads to ill-informed decisions and lost revenue.

Before analysis, big data must be run through automated cleansing tools that check for and correct duplicates, anomalies, missing information, and other errors. Setting specific data quality standards and measuring these benchmarks regularly will also help by highlighting where data collection and cleansing techniques must change.
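
As a rough illustration, a cleansing pass in pandas might deduplicate records, normalize formatting, and drop impossible or incomplete values, as in the sketch below. The columns and rules are hypothetical, stand-ins for whatever standards your organization sets.

```python
# A minimal data-cleansing sketch with hypothetical customer records.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [101, 101, 102, 103],
    "email": ["a@x.com", "a@x.com", None, "C@X.COM "],
    "purchase_total": [19.99, 19.99, -5.00, 42.50],
})

df = df.drop_duplicates()                          # remove exact duplicates
df["email"] = df["email"].str.strip().str.lower()  # normalize formatting
df = df[df["purchase_total"] >= 0]                 # drop impossible values
df = df.dropna(subset=["email"])                   # drop incomplete records
print(df)
```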

3. Integration and Data Silos

Big data’s variety helps fill some quality gaps, but it also introduces integration issues. Compiling multiple file types from various sources into a single point of access can be difficult with conventional tools. Data often ends up in silos, which are easier to manage but limit visibility, undermining both security and accuracy.

Cloud storage and management tools let you shift information between databases to consolidate them without lengthy, expensive transfer processes. Virtualization can also make integration easier—data virtualization tools let you access and view information from across sources without moving it, which increases visibility despite big data’s volume and velocity.

4. Data Storage

Storing big data can be a challenge—and a costly one. Businesses spent $21.5 billion on computing and storage infrastructure in the first quarter of 2023 alone, and finding room to store big data’s rapidly increasing volumes at its rising velocity with conventional means is challenging, slow, and expensive.

Moving away from on-premise storage in favor of the cloud can help—pay for what you use and scale up or down in an instant, removing historical barriers to big data management while minimizing costs. But the cloud alone won’t be sufficient to keep pace. Compression, deduplication, and automated data lifecycle management can help minimize storage needs, and better organization—also enabled by automation—allows faster access and can reveal duplicates or outdated information more readily.
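
Deduplication in particular often starts with content hashing, so identical files can be spotted no matter how they are named or where they live. Here is a minimal Python sketch of that idea; the storage root and file pattern are assumptions for illustration.

```python
# Find byte-identical files by hashing their contents.
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

seen: dict[str, Path] = {}
for path in Path("data_lake").rglob("*.csv"):  # hypothetical storage root
    digest = file_digest(path)
    if digest in seen:
        print(f"duplicate: {path} matches {seen[digest]}")  # removal candidate
    else:
        seen[digest] = path
```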

Read our 2023 Cloud Computing Cost: Comparison and Pricing Guide.

5. Lack of Experience

Technical issues may be the easiest challenges to recognize, but user-side challenges deserve attention too—and one of the biggest is a lack of big data experience. Making sense of big data and managing its supporting infrastructure requires a skillset lacking in many organizations. There’s a nationwide shortage of jobseekers with the skills being sought by enterprises, and it’s not getting any better.

One solution? Rather than focusing on outside hires, foster data talent from within existing workforces. Offer professional development opportunities that pay employees to go through data science education programs. Another is to look for low-code or no-code analytics solutions that don’t require skilled programmers—similarly, off-the-shelf software and open source big data solutions are more common than ever, making it easier to embrace big data without extensive experience.

6. Data Interpretation and Analysis

It’s easy to forget that big data is a resource, not a solution—you must know how to interpret and apply the information for it to be worth the cost and complexity. Given the sheer size of these datasets, analysis can be time consuming and tricky to get right with conventional approaches.

AI is the key here. Big data is too large and varied to analyze manually with any speed or accuracy, and humans are likely to miss subtle trends and connections in the sea of information. AI excels at detail-oriented, data-heavy tasks, making it the perfect tool for pulling insights from big data. Of course, AI itself is just a tool and is also prone to error. Use AI analytics as a starting point, then review and refine with expert human analysts to ensure you’re acting on accurate, relevant information.

7. Ethical Issues

Big data also comes with some ethical concerns. Gathering that much information means increased likelihood of personally identifiable information being part of it. In addition to questions about user privacy, biases in data can lead to biased AI that carries human prejudices even further.

To avoid ethical concerns, businesses should form a data ethics committee or at least have a regular ethical review process to review data collection and usage policies and ensure the company doesn’t infringe on people’s privacy. Scrubbing data of identifying factors like race, gender, and sexuality will also help remove bias-prone information from the equation.

While size is one of big data’s strongest assets, consider whether you need all the information you collect—not storing details that don’t serve a specific, value-adding purpose will minimize areas where you may cross ethical lines.

The Bottom Line: Eliminate Challenges to Succeed with Big Data

Big data is a complicated issue. The sheer volume and variety of the data, and the speed at which it accumulates, pose technical challenges to enterprises looking to establish the infrastructure to process, store, and analyze it. The nature of the work also demands expertise that’s not always easy to come by. As a result, most big data projects fail. But the payoffs are also big, and enterprises that approach big data strategically and prevent or overcome common obstacles can capitalize on the promise of big data.

Read The Future of Big Data to learn about the trends shaping this field and how they will affect the way enterprises work moving forward.

What is Big Data Management?

Big data management refers to the governance, administration, and organization of the enormous volumes of data companies handle throughout its lifecycle, including ingesting, processing, storing, and analyzing it to fuel decision-making and keep operations running smoothly. Because big data management touches on many areas of an enterprise’s work, it takes time and a concentrated effort to create and stick to an effective action plan. This article provides an overview of the different components of big data management, its benefits and challenges, and some of the most common techniques and best practices. It also explores the services and vendors available to help businesses with their big data management efforts.

What is the Importance of Big Data Management?

Big data management concerns how organizations store and handle data. Adherence to best practices can make costs more manageable and ensure businesses have the appropriate infrastructure for retaining information now and in the foreseeable future, making it easier to scale up as needed or maintain the proper security for personal or confidential data.

Done right, big data management ensures that an organization’s data is accessible, well-organized, and accurate. It’s essential for increasing people’s trust in the information they rely on for decision-making. Advanced analytics platforms won’t give reliable results if the data is inaccurate—crafting and enforcing well-defined guidelines for processing and handling data ensures an organization’s data is consistent, accurate, and secure.

A focus on data governance as part of big data management can protect an enterprise by limiting the damage of a security breach or similar problem, and reduce regulatory issues by ensuring compliance with legal or jurisdictional data policies—for example, the European Union’s General Data Protection Regulation (GDPR), which has stipulations that allow people to see the information organizations have about them.

The volume of data organizations collect and store has never been higher, and it’s only growing. Enterprises that don’t have proactive strategies for managing that data will find it difficult to catch up and risk damaging their operations or reputations, and potentially facing legal or regulatory issues.

Read Big Data Analytics to learn more about how organizations are working with the data they collect.

Big Data Management Challenges

The sheer volume of data presents the single biggest challenge when it comes to big data management. Unstructured data—data like emails, social media content, and multimedia—presents different challenges than structured data like spreadsheets and database records, and in aggregate, managing so much data in so many different formats from so many disparate sources requires strategy.

Organizational silos complicate those efforts, upping the risk of duplicate or hidden information or inconsistencies in how data is collected, formatted, or stored.

Managing big data can become even more daunting if organizations don’t implement scalable plans for handling incoming spikes. For example, many organizations are especially busy during specific times of the year—it’s more challenging to efficiently and effectively use the additional information associated with those periods if leaders haven’t planned for the anticipated surges.

Big Data Management Benefits

Big data management allows businesses to remain competitive and feel confident about the information they use to make critical decisions. It also provides a number of additional benefits—here are the most common.

  • Scalability–data management lets businesses create repeatable processes to increase or decrease systems based on data needs, providing predictability and minimizing costs.
  • Security–effective policies for how data is stored and who can access it can ensure data is backed up, recoverable, and protected from unauthorized access.
  • Accessibility–by maintaining consistent approaches to collecting, formatting, and storing data, organizations make it available to the right people at the right time.
  • Accuracy–big data management can increase trust in data across an organization by ensuring it is accurate and reliable.
  • Compliance–data retention and privacy policies help keep an organization in line with jurisdictional and legal regulations, ensuring compliance and preventing privacy concerns.

4 Big Data Management Best Practices

Big data management is most effective when it follows industry standard best practices. Here are some to strongly consider, regardless of organization type or size.

Know Which Data to Prioritize

The time and money organizations must invest in big data management typically increases alongside information volumes. Businesses should identify the information that is most important and create data retention policies for how—and how long—to retain it as well as what data should be purged to minimize storage costs and reduce time spent searching.
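
In practice, a retention policy reduces to an enforceable rule. The sketch below shows one hypothetical form of it, purging records older than a 365-day window from a SQLite table; the database, table, and window are all assumptions for illustration.

```python
# Enforce a hypothetical 365-day retention window on a SQLite table.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 365  # assumed policy window
cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()

conn = sqlite3.connect("analytics.db")  # hypothetical data store
conn.execute("CREATE TABLE IF NOT EXISTS raw_events (payload TEXT, ingested_at TEXT)")
deleted = conn.execute(
    "DELETE FROM raw_events WHERE ingested_at < ?", (cutoff,)
).rowcount
conn.commit()
print(f"purged {deleted} records older than {RETENTION_DAYS} days")
```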

Create Backup and Recovery Strategies

All big data management efforts should include steps to keep data safe from cybersecurity threats, natural disasters, or storage failures. Regular backup and recovery plans are a critical part of any big data management strategy.

Understand the Types of Data You Have

All data falls into one of three categories: structured, unstructured, or semi-structured. Structured data includes numbers or text strings that relational databases can handle, while unstructured data is more varied and can consist of information stored in audio files, images, or videos. Semi-structured data contains characteristics of structured and unstructured data. Most organizations have significantly more unstructured than structured data. Knowing how to store, access, and work with the right kind of data is key.
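
The distinction is easy to see with toy values, as in this small sketch; the examples are purely illustrative.

```python
# Toy examples of the three data categories.
import json

# Structured: fixed columns and types, ready for a relational table.
structured_row = {"customer_id": 101, "purchase_total": 42.50}

# Semi-structured: self-describing and flexible in shape (JSON, XML, logs).
semi_structured = json.loads('{"customer_id": 101, "tags": ["vip", "mobile"]}')

# Unstructured: raw bytes such as audio, images, or video, where meaning
# must be extracted with specialized processing (speech-to-text, vision).
unstructured = b"RIFF....WAVEfmt "  # opening bytes of a hypothetical audio file

print(type(structured_row), type(semi_structured), type(unstructured))
```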

Establish Data-Handling Processes

Much of an organization’s data will arrive for processing in various formats and from different sources. Proactive processes for evaluating, cleaning, and formatting it can ensure consistency and reduce errors. Making data-handling improvements may involve following steps to standardize the information’s format or screening it for duplication problems.

Bottom Line: Implementing Big Data Management

Big data management is essential for enterprises as the volumes of data—and the importance of its role in operations—skyrocket. Tackling it can be complex. A number of providers offer services to help with enterprise big data management, from auditing existing processes and making detailed recommendations to outsourcing data management strategies and procedures entirely. They can also help create action plans for moving data between locations, such as from on-premises solutions to the cloud.

Other vendors offer data management tools aimed at taking much of the burden off of enterprises.

Microsoft Azure, Google Cloud, and Amazon Web Services (AWS) all offer products to move, store, and analyze data, and IBM offers artificial intelligence-powered products to facilitate big data management and improve decision-making.

Whether in-house, outsourced, or a hybrid model, all enterprises should be fully engaged in a big data management strategy that protects, improves, and assures the reliability of their most valuable asset.

Read Top 7 Data Analytics Tools and Software to discover the best platforms for working with and visualizing data.

Pros and Cons of Big Data

The term “big data” refers to both structured and unstructured data in a volume and variety too massive in scale and complexity to be managed using traditional methods. Specialized tools are required to manage the data and to find patterns, track trends, and extract other meaningful information to provide the kind of insights on which businesses increasingly rely. This article explores the pros and cons of working with big data and the challenges it presents and looks at some of the top business intelligence tools to manage it.

Pros and Cons of Big Data

Big data can give leaders more decision-making resources and insights. It can make enterprise organizations more competitive and help them tailor their offerings to customers with more confidence. It can build customer engagement and loyalty and feed marketing and pricing decisions.

But the vast quantities of information most businesses collect and accumulate can make data management particularly challenging—especially for those organizations not prepared for the task. There are also concerns about what kind of information businesses collect, and what they choose to do with it.

Whether and how a business decides to incorporate a big data strategy into its overall business intelligence efforts will come down to different factors, including goals, budget, and staff, but there are a number of advantages and disadvantages to be considered.

Big Data Advantages

There are many advantages to working with big data, but most fall within a few main categories. Here’s a high-level overview.

Improved Decision-Making Capabilities

Investing in big data can provide the kind of information business leaders need to make challenging decisions, helping them identify and weigh all the relevant factors that can affect the outcome of their choices. Big data can be used across all departments and in all industries.

Historical data, customer data, and competitive market research, for example, can guide businesses when expanding their products and services, moving into new markets or geographic areas, or making acquisitions.

Better Customer Engagement

Meeting customer needs is challenging, especially as companies expand into new markets or offer new products—different customers have varying priorities and interests based on demographics, regional preferences, and more. Big data can provide clarity to help businesses earn customer loyalty and drive sales.

Enterprises can gather data from social media, sales records, customer feedback surveys, and other sources to learn more about their buyers. In fact, data shows that customers reward those efforts—a 2022 study found that 71 percent of respondents expect brands to understand them but less than half felt understood. The strategic use of big data can close that gap.

Increased Opportunities for Social Good

Big data can also provide insights to spark positive, lasting changes—researchers have used it to identify domestic violence and homelessness trends, for example, which can help nonprofits and other organizations provide more appropriate resources and support.

For enterprises looking to expand their corporate social responsibility efforts or highlight their social impact, big data can help them fine tune their resources based on need, interest, and return on investment.

Big Data Disadvantages

Enterprises looking to work with big data will face a number of challenges. Here are the main disadvantages they should consider.

Cost of Doing Business

Big data can be expensive to work with. It’s not as straightforward as investing in a tool and expecting results—working with big data is complex. It requires investments in storage solutions, analytics tools, and cybersecurity and governance programs.

From data scientists and analysts to storage and cybersecurity experts, it also requires staff expertise. Businesses can expect an initial investment and ongoing costs, which might not realize results for some time.

Privacy Concerns

Knowing more about customers can benefit businesses, but it also raises privacy issues. From rewards programs to apps, businesses can gather enormous amounts of information about their customers, their shopping habits and preferences, even their biometric data and their behavior in online and brick-and-mortar stores.

This information can be used to tailor discounts and promotions, but organizations have to walk a line with privacy. A 2023 Razorfish study found that 21 percent of respondents thought brand personalization was simultaneously great and alarming—it also found that half of all respondents would no longer do business with a brand that shared their information without consent.

Data Quality Issues

Even the most advanced big data platforms and cutting-edge technologies can’t compensate for poor quality information. Duplicate records, inaccurate details, and formatting errors are just a few of the many data quality issues and anomalies that can lead to incorrect conclusions.

As businesses gather more information on an ever-expanding scale from disparate sources and try to make it all actionable, it becomes increasingly difficult to ensure consistent quality across the board. Enterprises need to work with experts to audit and validate data on a regular basis.

Talent and Staffing Needs

Working with big data—and big data tools—often requires specialized skills. As larger enterprises expand their own data science initiatives, the market for skilled professionals becomes more competitive and smaller businesses struggle to find experienced staff.

Enterprises seeking experienced professionals to work with big data need to invest in their recruiting efforts by offering competitive salaries and benefits and fund education and development for staff if they want to find the right people for critical roles.

Image: An illustration listing the pros and cons of big data.

Challenges of Working With Big Data

What does it take to establish a big data strategy at the enterprise level? It’s a complex implementation that only succeeds if it has buy-in at all levels of the organization, from line staff to leadership. Shifting to data-driven decision making involves infrastructural investments, data analytics and visualization efforts, software selection and implementation, and ongoing training—in short, it requires a cultural shift.

Here are some of the most common challenges businesses encounter when establishing big data programs:

  • Scaling up efforts as needed
  • Identifying relevant data types and locations
  • Keeping customer data secure and honoring privacy
  • Sticking to budgets and timelines
  • Getting staff buy-in on new systems, tools, and ways of doing things
  • Finding trusted vendors and service providers

Vendors and service providers can help, but they can only do so much. Big data is only helpful if it is used for decision making, and that has to happen within the organization itself.

Top BI Tools for Managing Big Data

Business intelligence (BI) tools can facilitate big data efforts by making it easier to manage, analyze, and report on the information to provide the clearest insights. There’s a wide range of tools from many providers on the market, and finding the right one comes down to specific needs. Here are some of the most popular enterprise BI solutions.

Amazon QuickSight

Because so many organizations rely on Amazon Web Services (AWS) for their infrastructure as a service (IaaS), platform as a service (PaaS), and hosted private cloud needs, Amazon QuickSight has a ready customer base—particularly among those that use the cloud service to store their business data.

It also offers a unique, pay-per-session pricing model that means organizations only pay for their use of interactive dashboards. Other key features include scalability, the SPICE in-memory calculation engine, ML Insights, embedded analytics, and a mobile interface.

Cloudera Data Platform

The Cloudera Data Platform is a hybrid tool marketed to businesses with information spread across public and private clouds or on-premise facilities. User-friendly dashboards and enterprise-grade data security and governance features help company representatives use customer information in trustworthy, responsible ways.

Microsoft Power BI

Microsoft’s Power BI is a popular tool that lets users create a single source for all data, promoting easy access and analysis. It includes many options for visualizations to make it easier to work with data across stakeholders and teams.

Power BI gets strong reviews from analysts and is one of the best-selling business intelligence tools. Noteworthy features include advanced data protection and governance capabilities, integration with other Microsoft applications and cloud computing services, self-service analytics, fast data preparation, streaming dashboards and more.

Qlik Sense

Qlik’s flagship BI software, Qlik Sense, incorporates an artificial intelligence-based associative analytics engine. Other features include fast performance, insight suggestions, automation, mobility, open APIs, and multi-cloud deployment options.

Qlik’s AI and augmented intelligence capabilities differentiate it from many of the other BI applications.

The company is focused on making analytics accessible to all, and accordingly, its platform is very user-friendly. Multi-cloud support gives enterprises a lot of flexibility in deployment.

Tableau

Tableau is an all-purpose data management tool that allows people to prepare, evaluate, and share information. Acquired by Salesforce in 2019, Tableau continues to sell its BI software under its own brand name and includes a wide variety of elements, including Desktop, Browser, Mobile, and Embedded versions. It incorporates data preparation, governance, content discovery, analytics, and collaboration capabilities, and can be deployed in the cloud or on premises.

While the price tag can be high depending upon the implementation, Tableau offers very powerful data visualization capabilities and creates attractive dashboards. It also has an extensive library of online help and active public support forums.

Read Best Business Intelligence Software and Tools to learn more about what differentiates the best BI platforms on the market and how they meet your needs.

Bottom Line: Using Big Data for Good

Big data can be a game-changer for enterprises looking to step up their business intelligence programs. When done well, big data can provide insights about customers, fuel data-driven decision-making, and feed many aspects of businesses’ work, from marketing to finance to human resources. But working with big data requires investments in infrastructure and staff, corporate cultural shifts, and an expertise-driven strategy.

It also demands careful attention to privacy rights and security concerns. Companies working with big data need to find a balance in how they use what they know about their customers. Big data can help them improve promotions, better target advertising and marketing campaigns, and highlight products that cater to their preferences—but it can also be used in troubling ways, both by intent and by carelessness.

By being transparent about what data they collect and what they plan to do with it and being explicit about the perks of providing data, businesses can boost customer engagement and brand loyalty while making their customers feel valued rather than exploited.

To see how the most popular tools for analyzing and visualizing data stack up against big data needs, read Top 7 Data Analytics Tools and Software in 2023.

A Guide to the Most Common IoT Protocols and Standards 2023

Internet of Things (IoT) devices are seemingly everywhere, from the mobile phones in our pockets and the smart thermostats and doorbell cameras in our homes to the manufacturing facilities where they were made. Protocols and standards ensure that these devices can function correctly and communicate with one another, generating the data that makes them so useful. Here’s a look at the most common IoT protocols and standards.

What are IoT Protocols and Standards?

IoT protocols are established rules about how IoT devices should work and communicate. Standards are similar to protocols, but are used more widely—across an entire industry, for example. Together they ensure that all IoT devices have a minimum level of compatibility with one another and with other related devices and applications.

For instance, a manufacturer might use two different IoT sensors from different brands. As long as both companies follow the same guidelines, the sensors will work on the same network. IoT protocols and standards typically function in a single layer as a distinct part of a larger network—most commonly in the application and middleware layers of a standard five-layer network architecture, although not exclusively. For example, Bluetooth and Wi-Fi operate on the network layer.

Image: Diagram of standard five-layer network architecture via Dr. João Pedro Reis.

Commercial IoT Standards and Protocols

Commercial IoT is a huge, still-growing industry. Interest in smart home tech is creating high demand for devices in the consumer electronics market. As a result, protocols and standards are emerging to ensure consumers get a streamlined, user-friendly experience. While some of these standards are also used in industrial applications, their biggest benefits stand out most in commercial settings. A few commercial IoT standards and protocols are so widely used they have become ubiquitous—like Bluetooth and Wi-Fi, for example.

Bluetooth

It’s hard to imagine consumer electronics today without the Bluetooth standard for wireless device-to-device communication. Every new smartphone, tablet, and laptop includes Bluetooth support as a standard feature.

Bluetooth was one of the first IoT communication protocols to open the door for a boom in consumer IoT devices, such as smartwatches and wireless headphones. It uses wireless personal area networks (WPANs), allowing for short-range data transmission using radio waves.

Bluetooth was originally standardized by the world’s largest technical professional organization, the IEEE, in 2002 under standard IEEE 802.15.1. Though updates ceased in 2018, Bluetooth remains an extremely popular IoT protocol—particularly among consumer electronics.

Data Distribution Service (DDS)

The Data Distribution Service (DDS) protocol and standard is designed for communication across hardware and software platforms. Its main benefits include easy scalability, high reliability, and low-latency connectivity. DDS is great for ensuring all the IoT components in a system can maintain high-quality data transfers.

DDS is popular across commercial and industrial IoT applications. Originally published in 2004 by the Object Management Group, which maintains it today, it is a middleware protocol for standardizing machine-to-machine communication using the publisher-subscriber model.

Image: Diagram of DDS Scaling, via DDS Foundation/Object Management Group Inc.

Matter

Matter is a communication and interoperability standard designed to address the issue of smart home device communication between brands. Many commercial device manufacturers want consumers to buy all their smart home devices from one brand. This isn’t necessarily in the consumer’s best interest, but poor communication between products from different companies may force them to pick a single brand.

Matter ensures that smart home devices from participating manufacturers work together natively. It benefits both manufacturers and consumers. Since companies don’t have to be a one-stop shop, they can instead focus on making great smart thermostats, for example, without worrying about losing money to a competing brand that also makes other products.

Wi-Fi

Wi-Fi is among the oldest IoT standards and one of today’s most well-known and widely used. Its invention dates back to 1942, when actress and inventor Hedy Lamarr patented frequency hopping. It evolved over the decades until the first Wi-Fi standard was created in 1997.

This first set of standards established the Wi-Fi we know today. The IEEE 802.11 family of standards outlines how communication over wireless local area networks (WLANs) should work. It also establishes a minimum data transfer speed of 2 megabits per second. The IEEE continues to maintain the 802.11 standards, and Wi-Fi is still found in most consumer electronics and commercial IoT devices, such as smart home appliances and sensors.

XMPP

Extensible Messaging and Presence Protocol (XMPP) was originally developed for human-to-human communication in 2002. In the 20-plus years since, it has evolved into a machine-to-machine communication protocol popularly used by smart appliances.

Today, XMPP is an open-source protocol maintained by the XMPP Standards Foundation. It’s a lightweight middleware system that standardizes communication and XML data. XMPP runs in the application layer, where it can provide near-real-time data transfers. This responsiveness, combined with XMPP’s high accessibility, makes it ideal for communicating with smart home devices like appliances.

Industrial IoT Standards and Protocols

The industrial IoT market is among the strongest-performing in the world, which should come as no surprise given the countless applications of IoT in manufacturing, logistics, and construction. Industrial IoT (IIoT) is considered its own distinct niche.

IIoT standards and protocols are becoming increasingly important as businesses grow to rely on their IoT devices more. For instance, a manufacturer in a smart factory might use IIoT sensors to send maintenance alerts, which could affect employee safety. IoT communication standards ensure sensors send real-time alerts successfully, regardless of the brand or model.

Constrained Application Protocol (CoAP)

Constrained Application Protocol, or CoAP, is a protocol that allows IoT devices to use HTTP without excessive power consumption. Launched in 2013, it’s popular for machine-to-machine (M2M) communication—particularly in industrial applications like supply chain environments.

CoAP lets industrial users include a wider variety of IoT devices in their networks without being restricted by low power capabilities or bandwidth. Its main drawback is a lack of security features. CoAP is somewhat exposed on its own and needs the additional datagram transport layer security (DTLS) protocol to ensure secure data transmission.

Lightweight M2M (LWM2M)

Lightweight M2M, or LWM2M, is a protocol specifically for remote device management in IoT or machine-to-machine environments. It is purpose-built for IoT sensors, making it a highly useful protocol for industrial applications. Its light weight means it doesn’t require much power, storage, or computing resources to run.

LWM2M was originally published in 2017 and is still active and maintained by OMA SpecWorks. The 2020 update to the protocol added compatibility with edge networking and 5G, making LWM2M a cutting-edge standard for today’s industrial environment. LWM2M works over TCP/TLS, MQTT, and HTTP.

MQTT

MQTT is an application-layer protocol for machine-to-machine communication using the publisher-subscriber model. It was developed in 1999 and is a popular open-source protocol for standardizing communication between industrial IoT devices.

MQTT is particularly well-suited for IIoT sensors due to its lightweight nature and tolerance for low bandwidth. Since it doesn’t require much memory space, MQTT is highly compatible with the full range of IIoT devices. It essentially acts as a bridge to applications.
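
As a concrete illustration of the publisher-subscriber model, here is a minimal sketch using the open-source paho-mqtt client library (its 1.x callback API; version 2.x changes the Client constructor). The broker address and topic are placeholder assumptions.

```python
# Minimal MQTT publish/subscribe sketch (paho-mqtt 1.x API).
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # Called for every message on a subscribed topic.
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.local", 1883)  # placeholder broker, default port
client.subscribe("factory/line1/temperature")  # listen for sensor readings
client.publish("factory/line1/temperature", "72.4")  # publish a reading
client.loop_forever()  # process network traffic and dispatch callbacks
```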

Zigbee

Zigbee is a highly popular network protocol specifically for mesh networks used in automation. Consumer and industrial devices use Zigbee, although its emphasis on automation and various applications makes it ideal for business. It was developed by the Connectivity Standards Alliance, which also created Matter.

Zigbee’s top benefits include low power consumption and a high degree of flexibility. It’s designed for short range, similar to Bluetooth. One feature that’s particularly beneficial in the industrial space is its high level of security. Zigbee includes encryption and authentication by default while staying lightweight. This means industrial users can build a mesh network of IoT devices with security features without using excessive power and computing resources.

Security IoT Standards and Protocols

Cybersecurity standards have always played an important role in the IoT’s development and growth. Some communication-related protocols include security features, but this isn’t always the case. A growing pool of IoT protocols and standards is designed to emphasize cybersecurity. Some of these are add-on rulesets for other offerings—for instance, Wi-Fi Protected Access 2 is one of today’s leading network security protocols to add to Wi-Fi.

Ascon (NIST)

Ascon is the National Institute of Standards and Technology’s (NIST) official standard for IoT encryption, selected in 2023. It is now the formal standard in the U.S. for securing IoT devices and communications.

Ascon is a collection of cryptographic algorithms that provide highly secure encryption without requiring high amounts of power and computing. Implementing Ascon can help IoT device manufacturers be more proactive about preventing cyberattacks and vulnerabilities rather than just responding to them.

DTLS

Datagram Transport Layer Security, or DTLS, is a security protocol for encrypted communications. A datagram is a standard data transfer unit, such as a single message—they are commonly used in gaming, streamed video, or videoconferencing applications.

Designed by the Internet Engineering Task Force, DTLS secures wireless communications so senders and receivers know their messages won’t be intercepted or spied on. It’s a commonly used protocol across commercial and industrial spaces.

Z-Wave

Z-Wave is a proprietary alternative to protocols like Bluetooth and Wi-Fi designed for encrypted mesh network communications, offering more security than its open-source counterparts. It functions on various low-level radio frequencies.

Z-Wave is popular among smart home automation systems, particularly those focusing on security. It is primarily used in consumer electronics and commercial applications but can also be used in industrial environments.

Bottom Line: Understanding IoT Protocols and Standards

IoT devices are a common part of people’s lives. They’re in our homes, our doctors’ offices, our oceans and skies, and businesses increasingly rely on them for a wide range of purposes. Day in and day out, these devices generate massive volumes of data used for business intelligence, competitive analysis, more efficient manufacturing, consumer feedback, and more. Dozens of protocols and standards run in the background to ensure that these devices and sensors work smoothly and securely and can communicate with each other effectively—understanding these protocols can help enterprises make better purchase decisions and build more secure, robust IoT networks.

Read next: Top 7 IoT Analytics Platforms

Data Management: Types and Challenges

Data management encompasses the processes of gathering, organizing, storing, handling, and securing information according to specific needs, and in compliance with any applicable regulations. While data management can be challenging under the best of circumstances, doing it well can be particularly difficult—especially for enterprises whose increasingly complex data burdens involve massive amounts of data from numerous sources. This article provides a brief guide to understanding and following the associated best practices to help reduce risks.

Types of Data Management

There are numerous data management strategies, and most organizations can expect to use several depending upon their specific needs. Here are eight of the most common types of enterprise data management.

Data Cleansing

Data cleansing involves analyzing collected information for inconsistencies and errors and getting it into the desired format. For example, a database with phone numbers in various formats could contain duplicate entries or other errors that will skew the results if not addressed—data cleansing would mean deduplicating entries and reformatting all the telephone numbers to meet a consistent standard to ensure the reliability and accuracy of the data.
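
A minimal sketch of that phone-number example might look like the following, assuming US-style ten-digit numbers and a single target format.

```python
# Normalize assorted phone-number formats, then deduplicate.
import re

raw_numbers = ["(555) 123-4567", "555.123.4567", "+1 555 123 4567", "5551234567"]

def normalize(number: str) -> str:
    digits = re.sub(r"\D", "", number)  # strip everything but digits
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]  # drop a leading US country code
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

cleaned = sorted({normalize(n) for n in raw_numbers})  # a set removes duplicates
print(cleaned)  # all four inputs collapse to ['555-123-4567']
```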

Data Architecture

Data architecture is a visual representation of how information flows through an organization. What are all the sources of data, where is it all stored, which teams handle it, and which applications and devices process it? Detailed answers to these questions help create an applicable data strategy and identify weak spots that could make it challenging or impossible to handle and use the information effectively.

Data Modeling

Data modeling is similar to data architecture, except a data model relates to a specific type of information rather than all of them. These straightforward diagrams show people how specific data moves through the organization, which systems process it, and all the involved departments. A data model might examine information related to customers, third-party partners, employees, or any other relevant group, or show which systems store particular types of information within a company. Those specifics make it easier to find the data later and ensure it’s stored and handled correctly.

Data Pipelines

Data pipelines are an organization’s information pathways. They’re essential for getting information to the desired locations after ingestion. Extract, transform, and load (ETL) is one of the most widely used data pipelines, and involves pulling information from a database, altering it to meet an organization’s standards or formatting needs, and loading it into a new location. People commonly use data pipelines to increase productivity, since they can automate many associated processes. They also optimize usability since the information can move quickly from one place to another.
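
Even a toy pipeline shows the three ETL stages clearly. This sketch uses in-memory SQLite databases with hypothetical table and column names; real pipelines add scheduling, validation, and logging.

```python
# A minimal extract-transform-load sketch over throwaway SQLite databases.
import sqlite3

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE staging_users (name TEXT, email TEXT, signup_date TEXT)")
source.execute("INSERT INTO staging_users VALUES ('  ada lovelace ', 'ADA@X.COM', '2023-01-15')")

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE users (name TEXT, email TEXT, signup_date TEXT)")

def extract(conn):
    # E: pull raw rows from the staging database.
    return conn.execute("SELECT name, email, signup_date FROM staging_users")

def transform(rows):
    # T: normalize names and emails to a consistent standard.
    for name, email, signup_date in rows:
        yield (name.strip().title(), email.strip().lower(), signup_date)

def load(conn, rows):
    # L: write the cleaned rows into the warehouse table.
    conn.executemany("INSERT INTO users VALUES (?, ?, ?)", rows)
    conn.commit()

load(target, transform(extract(source)))
print(target.execute("SELECT * FROM users").fetchall())
```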

Data Cataloging

A data catalog details a company’s information resources and helps users search through them—for example, through a search bar interface that accepts keywords, short phrases, tags, or labels. Most catalogs allow further specificity, such as letting people find information that fits certain parameters. That might mean someone can search for only active customers rather than all customers in the database. A 2023 survey of company leaders found 41 percent lack the understanding to fully benefit from their organizations’ data assets, and 30 percent said the amount of information overwhelmed them. There’s no single solution for those issues, but creating and maintaining a data catalog could help make that information more visible, actionable, and manageable.
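
At its simplest, a catalog search is a filter over metadata entries. The sketch below shows hypothetical tag-based lookup, including the active-versus-all-customers case mentioned above.

```python
# A toy data catalog searched by tags; entries are illustrative.
catalog = [
    {"name": "active_customers", "owner": "sales", "tags": {"customers", "active"}},
    {"name": "all_customers", "owner": "sales", "tags": {"customers"}},
    {"name": "iot_sensor_readings", "owner": "ops", "tags": {"iot", "telemetry"}},
]

def search(required_tags: set[str]) -> list[str]:
    # Return datasets whose tags include every requested tag.
    return [e["name"] for e in catalog if required_tags <= e["tags"]]

print(search({"customers"}))            # -> ['active_customers', 'all_customers']
print(search({"customers", "active"}))  # -> ['active_customers']
```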

Data Access Control

For companies grappling with the best ways to capitalize on data, access control is often part of the discussion. Deciding who gets access to which data is a balancing act between accessibility and security. Data breaches don’t always happen intentionally. A 2023 insider risk study showed accidental data breaches were the most concerning for the business leaders polled. Additionally, 93 percent of respondents said hybrid-remote work arrangements have increased the need for data security training. One common approach is to restrict access according to role, giving everyone the information needed to do their jobs while mitigating security risks. But “mitigating” is not “eliminating,” and companies need to remain responsive and alert in case of a data breach—some companies only have 72-hour windows for reporting severe incidents to appropriate authorities.
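
Role-based restriction can be expressed as a simple permission map, as in the sketch below; the roles and dataset names are assumptions for illustration.

```python
# A toy role-based access check with hypothetical roles and datasets.
ROLE_PERMISSIONS = {
    "analyst": {"sales_aggregates", "web_metrics"},
    "support": {"customer_contacts"},
    "admin": {"sales_aggregates", "web_metrics", "customer_contacts", "audit_log"},
}

def can_access(role: str, dataset: str) -> bool:
    # Unknown roles get no access by default (least privilege).
    return dataset in ROLE_PERMISSIONS.get(role, set())

assert can_access("analyst", "web_metrics")
assert not can_access("support", "audit_log")
```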

Data Processing

Data processing involves turning raw data into usable information through a variety of methods. It includes collecting, manipulating, and transforming data, and can be done manually, but increasingly it is being automated to accelerate workflows. Advanced technologies like artificial intelligence and optical character recognition can process higher volumes with fewer errors than humans working manually. Data processing makes it possible to identify patterns and trends, discover new information, make the most of available resources, improve efficiencies, and make better decisions. It can be used to track consumer trends, measure consumer behavior, and create customer segments.

Data Governance

Data governance is the process of creating and enforcing policies and standards for data in an organization to ensure its security, protect privacy, and meet compliance requirements. To be successful, these policies require comprehensive frameworks that rely on participation from people at all levels of the company. As businesses store and use more and more data from diverse sources, including Internet of Things (IoT) devices, data governance programs become increasingly important to improve quality, remove silos, and make data accessible and secure.

Benefits of Data Management

Data management offers enterprises a wide range of advantages. Here are just a few.

  • Avoiding regulatory fines. Data management can reduce companies’ chances of being fined for improper data handling. Consider an April 2023 case in which regulators in Britain fined TikTok £12.7 million for insufficient screening to prevent underage users from using the platform. Data associated with minors, medical patients, or credit card details requires specific processing to comply with regulations.
  • Protecting critical files. Data management can reduce the chance of important files being deleted accidentally by users who don’t recognize them by name. Because effective data management shows the purpose of information and how it moves through an organization, users are less likely to delete things they believe are unimportant.
  • Facilitating responsible information-sharing. Some decision-makers start emphasizing data management to encourage information sharing across entities—for example, the World Health Organization requires research data-sharing for all initiatives it funds or conducts, a policy it says supports science and public health.
  • Improving engagement and security. A 2023 survey found 80 percent of public sector entities have started creating collaborative ecosystems where users can share data, which has improved citizen engagement and strengthened cybersecurity efforts. Enterprises can achieve similar results among employees.

Data Management Challenges

Even with well-thought-out, detailed plans, enterprises can expect data management challenges. A 2022 Deloitte study revealed that 45 percent of tech industry leaders cited gathering and protecting growing data volumes as their top challenges, while 32 percent of respondents cited the changing worldwide regulatory landscape.

Emerging technologies can present another challenge. For example, some employees have unintentionally revealed company secrets by feeding proprietary data into the ChatGPT chatbot to get help fixing broken code. OpenAI, the company behind the tool, has since introduced a feature that allows users to turn off their chat histories, preventing ChatGPT from using information in those messages to train future algorithms.

Lack of visibility can also make data management more difficult. Cloud tools facilitate access to data, but they can compromise visibility—a 2023 study showed only 40 percent of parties using the cloud have total visibility into their data’s location, which is particularly problematic if the information requires specific handling due to its content.

Solving problems within your organization starts with understanding the issues and associated ramifications. Take the time to get feedback from people at every level of the organization. Ask them how the identified obstacles affect their workflows and what they’d do to improve the situation.

Bottom Line: How to Implement Data Management 

Enterprises ironing out their data management plans should remain mindful of best practices, including being transparent with customers about why companies need their information and how they protect it. Stating the relevant information in easy-to-understand language can boost consumer trust and confidence. They should also thoroughly vet external service providers handling their data. In the eyes of their customers, and possibly authorities, the responsibility—and the blame—is theirs, not the third parties’.

Businesses should consider continuing education a foundational part of any data management plan. New processes, product launches, and expanding teams can change how a company handles information, and training employees on the latest preventive measures can help everyone abide by company data-handling rules and reduce mistakes. They should also clarify how data usage connects to the organization’s goals.

Organizations are most likely to optimize outcomes by fostering a data-driven culture. Everyone must understand how their actions can influence a company’s success and reputation. Adhering to an agreed-upon file-naming structure can make information easier to find and use later.

Data management can become complex, especially as information volumes rise. However, becoming familiar with various types, anticipating the frequently seen advantages, and recognizing common pitfalls will help you get the most out of your efforts.

To learn more about how to implement a successful data management strategy at your organization, read Top 10 Data Management Best Practices for 2023.

5 Ways to Use Big Data to Gain Customer Insights

Industries are leveraging extensive data services for competitive advantage, using high-volume, fast-incoming, and exceedingly diverse data to stay connected with consumers. Data provides the insights enterprises need to stay relevant, but its vast volumes present challenges around storage requirements, collection compliance, and time investments.

This article presents five use cases for enterprises to take advantage of big data to gain insights into customer behavior and improve their product and services offerings.

1. Procter & Gamble (P&G)

P&G has been in business around the world for almost two centuries. One reason it has endured is its knack for making timeless products—the company uses big data to remain at the forefront of consumer demand by knowing what to keep on store shelves. P&G collaborated with Microsoft and its Azure suite to harness the power of artificial intelligence (AI) and provide these much-needed insights.

Chief Data and Analytics Officer Guy Peri said the company’s multi-cloud big data solution is “an elaborately planned disruption” that starts and stops its business strategies as real-time data funnels into its databases. This data gives P&G the ability to be proactive as it identifies trends and immediately reacts when unexpected changes occur.

“In order for P&G to understand and best serve consumers,” Peri said, “we need to drive a data-enabled culture and operationalize algorithms into every major business decision.”

When companies get as large as P&G, data fragmentation is inevitable. One company branch has a separate data set, for example, while another across the globe has exclusive information that would benefit the greater whole. P&G’s big data strategy focused on connecting these silos to multiply the value of its data analytics efforts.

Combining data stores was step one. Then the company chose three challenges for data to address:

  • Create a more resilient supply chain
  • Improve retail execution
  • Design and deliver products with improved packaging

Big data can show how well suppliers perform, enhancing the strength of B2B relationships and encouraging supply chain diversification if specific products or parts receive frequent delays. Seeing what P&G can and cannot obtain from third-party suppliers can inform the company’s retail execution knowledge and determine, for example, whether it’s worth finding a supplier for a product that doesn’t sell well.

Finally, a constant flow of data allows P&G to continually improve existing products, whether for cost-effectiveness from an internal point of view or for quality on the customer side.

Industry: Consumer goods
Big data product: Azure Synapse Analytics

Outcomes:

  • Connect segmented data silos
  • Respond in real-time to product adjustments and improvements
  • Keep top-selling products on shelves

2. Starbucks

With 90 million coffee purchases and 25,000 global storefronts connected to apps and store tech, Starbucks has a wealth of information to guide marketing and sales decisions. The company’s primary resource is its proprietary app, which connects customers to the reward program and catalogs every purchase and store visit. Learning, for example, how frequently customers place to-go orders ahead of time via the app rather than making impromptu drive-through purchases provides valuable insights about customer preferences and behavior.

Starbucks is able to gain these insights because of its innovative Digital Flywheel strategy, which melds digital and in-person customer experiences to improve personalization, ordering, rewards, and purchasing.

As the app collects customer data, AI and big data combine to provide special offers for customers based on their favorite treats. Coupons and discounts relevant to their buying experience means they’re more likely to make subsequent purchases. By identifying trends—similar behaviors among millions of customers—Starbucks can also make decisions about which menu items are working and which are not.

For the company to continue to grow and change, it needs to keep customers interested in trying new things. The Starbucks app will recommend related drinks based on customer tastes, much as the data-driven Netflix does with subscribers’ viewing choices, expanding their interest in a broader array of products. These suggestions are tailored based on location-specific availability and are smart enough to recommend hot drinks on colder days by connecting with local weather services. The deeper the commitment to a wider variety of menu items, the more loyal the customer remains.

Looking at the big picture, Starbucks uses the collected personal data to inform campaigns on social media and shift the brand’s goals to initiatives that keep profitability sky-high. As Starbucks’ Chief Strategy Officer, Matt Ryan, explained, “This fundamental modernization of our technology stack will replace legacy rewards and ordering functionality with the new scalable cloud-based platform for rewards and ordering, improved customer data organization, and tighter integration with store-based operating systems, including inventory and production management.”

Industry: Food and beverage
Big data product: Starbucks rewards-connected app

Outcomes:

  • Personalized customer rewards offers
  • Thorough transaction cataloging
  • Streamlined audience research for campaigns and menus

3. Spotify

Music streaming platforms are ubiquitous nowadays, but something sets Spotify apart from the competition: in 2017, the music megagiant acquired AI music company Niland and blockchain company Mediachain to enhance its big data strategy.

The company has always embraced big data to keep listeners coming back for more. Spotify has information on every second of a listener’s habits and tastes, and the platform’s hyper-specific metadata separates songs into genres and bundles them into Daily Mixes that continually expose listeners to new music.

Spotify pairs big data with programming languages like Java and Python and frameworks like Apache Hadoop to make connections between how long users play each song, which devices they play it on, and whether they play it from a shuffled playlist or a preselected album. This tells the company how to adjust settings and defaults so users keep listening without interruption within the artists and genres they’re most likely to get invested in.
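As a rough illustration of the kind of signal aggregation involved, the sketch below groups play events by device and listening context and computes how often listeners play past a skip threshold. The event schema, track length, and threshold are assumptions made for the example, not Spotify’s internal format.

    # Illustrative only: the event fields and thresholds below are assumed,
    # not Spotify's actual schema or pipeline.
    from collections import defaultdict

    plays = [
        {"track": "Song A", "seconds_played": 210, "device": "phone", "context": "shuffle"},
        {"track": "Song A", "seconds_played": 30, "device": "desktop", "context": "album"},
        {"track": "Song B", "seconds_played": 180, "device": "phone", "context": "album"},
    ]

    def completion_by_context(events, track_length=220, skip_threshold=0.5):
        """Group play events by (device, context) and report how often
        listeners play past the skip threshold."""
        totals = defaultdict(lambda: [0, 0])  # key -> [completed, total]
        for e in events:
            key = (e["device"], e["context"])
            totals[key][1] += 1
            if e["seconds_played"] / track_length >= skip_threshold:
                totals[key][0] += 1
        return {k: done / total for k, (done, total) in totals.items()}

    print(completion_by_context(plays))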

Additionally, big data helps Spotify promote its music partners, both small and large: without substantial data, how would the company know whom to market an up-and-coming artist to? It also gives artists a way to create something resembling personal communication with fans. The Spotify Wrapped feature presents each listener with a year-in-review built from the past 12 months of data, and artists can use this information to offer their top listeners early tour announcements or discounts on merchandise.

Industry: Music
Big data products: Python and Apache Hadoop, among others

Outcomes:

  • Experiences like Spotify Wrapped and Discover Weekly
  • Increased fan/artist engagement through personalized communications
  • Curated playlist recommendations

4. Marriott

Since the COVID-19 pandemic, the hospitality industry has experienced a never-before-seen boom, as years of isolation left people eager to travel. The best way for companies like Marriott to stay in touch with customers is through big data. There is significant nuance in setting prices based on peak times and demand while accounting for economic conditions like inflation. Hotel chains must keep processing customer feedback from surveys to make changes across their properties, and must stay on top of local events, such as major concert tours and annual festivals, that bring spikes in tourism.

Marriott’s digital services department oversees incoming data to ensure accurate, sensible reservation experiences. How are cancellation trends affecting availability for customers? How fair is dynamic pricing, and will it increase or decrease bookings? What friction points between booking and check-in discourage customers from making subsequent reservations?

The company collects this information from several silos, like booking platforms and apps, but the primary source is the Marriott Bonvoy platform. It uses big data processing to give members tailored rewards experiences, and AI integration to adjust quickly against competitors and remain the best choice for travelers.
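To make the dynamic-pricing idea concrete, here is a deliberately simplified sketch; every factor, weight, and guardrail in it is a hypothetical illustration, not Marriott’s model.

    # Toy dynamic-pricing sketch; all factors, weights, and bounds are
    # hypothetical illustrations, not Marriott's actual pricing model.
    def quote_price(base_rate, occupancy, cancel_rate, local_event):
        """Adjust a base nightly rate for demand signals, within guardrails."""
        price = base_rate
        price *= 1 + 0.5 * max(occupancy - 0.7, 0)  # surcharge above 70% occupancy
        price *= 1 - 0.3 * cancel_rate              # soften when cancellations spike
        if local_event:
            price *= 1.15                           # festivals and tours lift demand
        # Fairness guardrails: never stray too far from the base rate
        return round(min(max(price, 0.8 * base_rate), 1.5 * base_rate), 2)

    print(quote_price(200, occupancy=0.85, cancel_rate=0.1, local_event=True))  # 239.83

Framing it this way makes the fairness questions above measurable: the guardrails cap how far demand signals can move the quoted rate.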

The app allows customers to receive a next-level, contactless hotel experience with such features as:

  • Mobile check-in with facial recognition
  • Alerts when the room is ready
  • Digital keys, eliminating lost or demagnetized key cards
  • Connectivity with in-hotel Amazon Echo devices
  • Customer feedback submissions
  • Room service or cleaning requests

Industry: Hospitality
Big data product: Marriott Bonvoy platform

Outcomes:

  • Dynamic, predictive room pricing
  • Suggestions for tech-integrated lodging experiences
  • Increased awareness of consumer booking and cancellation motivations

5. Mint

Mint is one of the most popular personal finance management tools on the market, with a functional, easy-to-learn user interface and a full suite of features. Because it offers a money management service that customers use regardless of their spending behaviors, Mint does not use big data to determine customer spending habits.

Instead, the fast-growing company analyzes spending categories using customer-input and bank-derived data to educate customers rather than to inform its internal business decisions. Mint’s data analysis processes can also detect issues like fraud, adding to the value of the services it provides to customers.
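As a hedged illustration of how transaction data can surface problems like fraud, the sketch below flags purchases that sit far outside a customer’s typical spending. The rule, threshold, and data are invented for the example; Mint’s real detection is proprietary and far more sophisticated.

    # Invented example: a simple z-score rule for flagging unusual spend.
    # Mint's actual fraud detection is proprietary and more sophisticated.
    from statistics import mean, stdev

    def flag_unusual(history, new_amount, z_cutoff=3.0):
        """Flag a transaction far outside a customer's typical spending."""
        if len(history) < 2:
            return False  # not enough history to judge
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return new_amount != mu
        return abs(new_amount - mu) / sigma > z_cutoff

    history = [42.0, 38.5, 55.0, 47.25, 40.0]
    print(flag_unusual(history, 900.0))  # True: far beyond normal spending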

With 13 million customer accounts and billions of transactions, Mint has built an incredible picture of consumers’ financial wellness and broader economic trends. It’s plain to see how these add up: Mint uses big data to help customers see snapshots of their money progress and net worth, but as a byproduct, it has assembled a collage of spending trends that reflects modern history, giving the company information about debt balances, payroll records, financial goals, investment portfolios, spending categories, and bank statements.

This led Mint to implement GenOS, Intuit’s large language model (LLM)-powered operating system. GenOS’s components all center on generating fast, accurate financial assistance for everything from taxes to budgeting while keeping data safe. Generative AI gives Mint a distinctive capability: instead of providing advice based only on historical user data, the company can deliver fresh, attentive responses tailored to the specific customer.

Industry: Fintech
Big data product: Intuit GenOS

Outcomes:

  • Snapshots of national spending behavior
  • Data-based financial advice
  • Actionable customer insights

Business Intelligence (BI) Tools to Help With Big Data

Big data is its own industry with a hand in nearly every other industry. For companies to build products informed by big data, they need BI tools to drive digital transformation and data discovery. Ideally, BI tools integrate with open-source frameworks like Apache Hadoop, a collection of open-source software utilities that make it possible to use a network of computers to process massive amounts of data.
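Hadoop’s classic programming model splits work into a mapper and a reducer. The word-count sketch below uses Hadoop Streaming, which pipes data through any executable via standard input and output; in practice the two functions would live in separate mapper.py and reducer.py scripts, and the input and output paths shown are placeholders.

    # Minimal Hadoop Streaming word count in Python. Run with something like:
    #   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py \
    #     -input /data/in -output /data/out    (paths are placeholders)
    import sys

    def mapper():
        """Emit one (word, 1) pair per word read from stdin."""
        for line in sys.stdin:
            for word in line.split():
                print(f"{word}\t1")

    def reducer():
        """Sum counts per word; Hadoop sorts mapper output by key first."""
        current, count = None, 0
        for line in sys.stdin:
            word, n = line.strip().rsplit("\t", 1)
            if word != current and current is not None:
                print(f"{current}\t{count}")
                count = 0
            current = word
            count += int(n)
        if current is not None:
            print(f"{current}\t{count}")

    # In separate scripts, each function would run under __main__.

Here are a few of the most popular BI tools to help with big data.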


Datameer

Datameer models data in its user-friendly interface, merging data sets, analytics, and other company system assets into a single dashboard. The appeal of Datameer is its scalability and interactivity, making it easy to train employees even without coding or analytics experience.


QlikSense

QlikSense has a unique touchscreen interface that works well for companies whose users work on the go. Its AI-powered search and conversational analytics features are specific to Qlik and give companies a new way to query their databases, surfacing insights that regular oversight alone would miss.


Zoho Analytics

Zoho Analytics provides visualizations while syncing with connected devices and easily integrated programs. Syncing is a crucial capability, especially when data silos remain fragmented: the tool gathers information from across them to form cohesive reports, presented in one crisp dashboard.

Bottom Line: Big Data in Practice

Companies that use big data show they intend to stay at the top of their markets. By relying on these technologies, they can deliver more personalized customer experiences and make more accurate decisions about where to take their businesses, because the data points the way.

Whether explaining national purchasing trends or serving up novel song recommendations, companies are doing everything in big data’s power to keep consumers engaged in an information-loaded world. Those that succeed can expect record revenue while streamlining and bolstering their business structures.

Read next: Top 23 Big Data Companies

5 Business Use Cases for Artificial Intelligence (AI) https://www.datamation.com/artificial-intelligence/artificial-intelligence-use-cases/ Mon, 10 Jul 2023 14:45:19 +0000 https://www.datamation.com/?p=21882 Artificial intelligence (AI) is one of the most revolutionary technologies of the modern era. Despite still being relatively new, it’s already redefined how many business functions work, and as it develops, new use cases across industries keep emerging.

AI is a versatile technology, so its potential applications span a huge range of sectors and purposes. Here’s a look at how five leading companies in different industries use it.

1. Facebook

Facebook is one of the most recognizable names in social media. Even though the sector is more crowded now than when it first started, Facebook remains the second most popular social platform in the U.S., according to Pew. With so many users, ensuring everyone sees what’s most relevant to them can be challenging.

In 2016, Facebook introduced DeepText, a proprietary deep-learning text analysis model, to gain a better understanding of user intent and interactions. DeepText can analyze thousands of posts per second in more than 20 languages to enable more personalized suggestions, identify spam, and highlight relevant content.

Industry: Social Media
AI Product: DeepText
Outcomes:

  • More accurate user intent analytics
  • Better spam and cyberbullying responses
  • Transparency into user sentiment across locations and languages

2. Daimler Trucks Asia

Daimler Trucks Asia (DTA) is a branch of the world’s largest truck manufacturer, so its internal processes have ripple effects throughout global supply chains. In light of this massive industry presence, catching potential safety or quality issues early is crucial.

DTA’s old, manual quality control system could take up to two years to identify and address safety issues. As the costs of those delays mounted, the company turned to Deloitte to build a custom AI analytics platform to streamline the process. The resulting solution, dubbed proactive sensing, analyzes a wide range of data, from vehicle metrics to social media engagement, to predict and quantify safety issues earlier and more accurately.

Because AI is better than humans at spotting trends in data, the solution can find indicators of potential problems human analysts may overlook. It can also predict issues before they happen, enabling faster, more cost-effective responses.
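One common building block for this kind of early detection is screening streaming metrics for readings that drift far from their recent baseline. The sketch below is a generic rolling z-score monitor, a hedged stand-in rather than Deloitte’s actual platform; the window size, cutoff, and data are all assumptions.

    # Generic early-warning sketch: rolling z-score over recent readings.
    # Window size, cutoff, and data are assumptions, not Deloitte's system.
    from collections import deque
    from statistics import mean, stdev

    class EarlyWarning:
        """Flag readings that drift far from the recent baseline."""
        def __init__(self, window=50, z_cutoff=3.0):
            self.history = deque(maxlen=window)
            self.z_cutoff = z_cutoff

        def observe(self, reading):
            alarm = False
            if len(self.history) >= 10:  # wait for a minimal baseline
                mu, sigma = mean(self.history), stdev(self.history)
                alarm = sigma > 0 and abs(reading - mu) / sigma > self.z_cutoff
            self.history.append(reading)
            return alarm

    monitor = EarlyWarning()
    readings = [90.0 + i % 3 for i in range(30)] + [140.0]  # spike at the end
    print([monitor.observe(r) for r in readings][-1])  # True: spike flagged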

Industry: Vehicle manufacturing
AI Product: Deloitte’s proactive sensing platform
Outcomes:

  • 50% reduction in issue detection time
  • $8 million reduction in warranty costs in the first two years
  • Preserved client reputation

Learn more about artificial intelligence in supply chains.

3. Humana

Humana is one of the largest insurers in the U.S., serving over 13 million customers across the country. Managing queries from all those customers can be a challenge, especially when you have to determine which cases are the most pressing. Humana had a voice chatbot to field these calls, but it transferred too many of them to human agents or expensive outsourced call centers, even though most were routine questions.

Using an AI solution from IBM, Humana built a new chatbot that could understand conversational language better, leading to more helpful responses. The bot can interpret a wider range of customer needs more accurately, enabling it to provide customized responses and help instead of sending lengthy FAQ pages or directing calls to human agents.

Industry: Insurance
AI Product: IBM Watson
Outcomes:

  • Handled inquiries at a third of the cost
  • Doubled response rate
  • Faster customer care responses

Learn more about how to use chatbots to improve customer service.

4. Panasonic

Natural language processing (NLP) AI services have uses outside of customer-facing chatbots, too. Panasonic, a global leader in electronic manufacturing, uses AI translation services to enable easier communication between its more than 240,000 employees and 500 affiliate companies.

Using global teams gives companies a wider talent pool and promotes flexibility, but language barriers can hamper productivity. To get around that issue, Panasonic implemented an AI system to translate documents between English and Japanese with remarkable speed and accuracy.

Another crucial advantage of AI over manual translation is better protection of trade secrets. Panasonic found that other methods and services may expose sensitive documents to data leakage. AI, by contrast, removes intermediaries, ensuring classified company documents stay secure throughout the translation process.

Industry: Electronics Manufacturing
AI Product: MiraiTranslate
Outcomes:

  • Faster, more accurate translations
  • Protected trade secrets
  • Enabled more productive global collaboration

5. Česká Spořitelna

Banking is another industry with high security and productivity needs, making it an ideal use case for AI. Česká Spořitelna, the largest retail bank in the Czech Republic, uses the technology extensively.

As a customer-facing bank, Česká Spořitelna uses AI to analyze its ad performance and user trends to inform more effective marketing strategies and boost lead generation. It also applies AI to its data processing practices to automate regulatory compliance. Similarly, AI helps the bank automate credit risk scoring, leading to faster, more reliable loan approvals.

With so many use cases, it’s unsurprising that the AI-in-banking market could be worth more than $64 billion by 2030, according to ReHack. AI is adept at understanding the real-world implications of numbers, making it the ideal tool for many banking processes.

Industry: Banking
AI Product: Keboola
Outcomes:

  • Streamlined credit risk scoring
  • Improved regulatory compliance
  • More effective marketing campaigns

Bottom Line: AI Business Use Cases

As these five use cases highlight, AI has uses across virtually every industry. This technology serves many purposes, from language comprehension to data analysis to trend prediction. Such a wide range of applications makes it a promising technology for any modern business.

Read next: AI in Education–The Future of Teaching.

5 Digital Transformation Examples https://www.datamation.com/trends/5-digital-transformation-examples/ Mon, 08 May 2023 19:33:42 +0000 https://www.datamation.com/?p=24107 The term “digital transformation” can be defined broadly, encompassing many ideas and largely tied to an organization’s specific goals. For example, a digital transformation might entail a business switching from paper forms to a cloud-based system for better recordkeeping, or developing an application to improve customer engagement options.

Typically, pursuing a digital transformation means prioritizing tools that enhance a company’s operations, profits, growth potential and more. Here are five examples of what a digital transformation could look like for a modern business to use as inspiration for progress at your own organization.

1. Digital Transformation Creates Strategic Advantage for a Healthcare System

Sometimes, a surprising revelation triggers the demand for a digital transformation. That was the case when leaders of a global and academic healthcare system realized how often people used mobile devices to get information about available health-related services. Data showed 40% of website visitors used mobile devices to get information about the system’s offerings.

This healthcare system was already successful, achieving $5.2 billion in revenue and attracting 30,000 employees and more than 4,000 physicians. However, one of the keys to becoming and remaining competitive is recognizing the need for change and responding accordingly. That meant relying on next-generation digital technologies to improve patient engagement and boost brand value.

Although the healthcare system already had a mobile app, customers wanted more than it offered. Early steps in the digital transformation process involved viewing offerings from patient and caregiver perspectives in multiple contexts. Officials used that information to discuss the characteristics of optimal digital experiences and engaged a service provider to develop a new app.

Next, the client narrowed down features for the first release of the mobile app. The service provider mobilized tech specialists to meet those demands, and these digital transformation plans ultimately fielded a unified experience application for the healthcare client just six months after initial planning.

Within three months, the application had a 64% month-over-month adoption rate across a target pool of 500,000 patients. Plus, the healthcare system expects $10 million in total savings due to associated physician efficiency gains and streamlined procedures in other clinical and back-office operations. Since the app allows co-pay authorizations when patients arrive, the client may see an additional $10 million from increased collections.

2. A Data-Centric Plan Helps a Multinational Food and Beverage Brand Excel

Nestlé has over 150 years of history and thousands of brands under its company umbrella. Executives continually look for practical and proven ways to enhance operations, and that often means being open to using data and adopting technologies.

The company focused on maintaining privacy, connecting with consumers, and pursuing ongoing experimentation during a recent digital transformation, according to Global Chief Marketing Officer Aude Gandon.

One component of that was developing a future-proof first-party data strategy emphasizing data safeguarding and privacy, which included using consent-mode features within Google Analytics and developing a global advertising technology roadmap.

Ensuring the digital transformation helps Nestlé connect with consumers means reaching 400 million customers with the company’s first-party database by 2025. That information will allow brands under the company’s umbrella to extract valuable insights that improve competitiveness. One Nestlé company used first-party data to achieve a 25% boost in ad spend return by improving connectedness with customers during seasonal events.

Finally, the part of the digital transformation plan focusing on ongoing experimentation heavily relies on cloud computing. In one project related to a coffee brand in Thailand, employees sent large volumes of data from past campaigns to Google Cloud. They then used machine learning algorithms to predict the most appropriate creative messages to show to specific YouTube audiences. This tactic caused a 17% increase in cost-per-view metrics and a 12% lift in ad recall.

These examples show why aligning long-term plans with certain focal points is often useful. Otherwise, it could become too easy to get off track with the digital transformation. That’s especially likely to happen if numerous decision-makers are weighing in with different opinions and not agreeing about the best ways forward.

3. Digital Supply Chain Planning Improves Pandemic Coping

Many digital transformations involve exploring how new technologies help companies reach their goals. For example, some business leaders are examining how non-fungible tokens (NFTs) could strengthen their supply chains. Improved supply chain visibility is a priority for many leaders, regardless of industry, and some experts believe NFTs could track parts from various locations, allowing better delay forecasting.

In one case, DuPont used digital supply chain planning through scenario modeling to foster better preparedness. The company’s supply chain experts scrambled to cope with the global market’s unpredictability in early 2020 as the COVID-19 pandemic worsened, and found the systems they’d previously used no longer met their needs in the unprecedented circumstances. Leaders decided DuPont needed a customized digital platform.

The result was a digital tool that lets people plan for what-if scenarios and make better business decisions, even in an uncertain environment. The system uses a custom-built algorithm and an open-source platform that enable people to run multiple potential scenarios at once within minutes, saving time and producing trusted results.

Users can also get financial projections, capacity planning, and inventory planning for up to two years into the future, making it easier for supply chain executives to make the choices needed to get products to those who need them. The platform supports the distribution and manufacturing of more than 1,000 products that reach people worldwide, and allows future scenario testing for nine supply chain categories across 75 locations.

DuPont’s supply chain planners run more than 20 scenarios per month. The company also has over two dozen people trained to use the tool.
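What-if scenario modeling of this kind often boils down to simulating demand under different shocks and measuring the risk of a shortfall. The Monte Carlo sketch below is a loose, hypothetical analogue of that workflow; DuPont’s actual algorithm is custom-built and unpublished, so every distribution and number here is invented.

    # Hypothetical what-if scenario sketch; DuPont's real algorithm is
    # custom-built, so all distributions and numbers here are invented.
    import random

    def run_scenario(base_demand, demand_shock, capacity, trials=10_000):
        """Estimate the probability that demand exceeds capacity under a shock."""
        shortfalls = 0
        for _ in range(trials):
            demand = base_demand * random.gauss(1 + demand_shock, 0.15)
            if demand > capacity:
                shortfalls += 1
        return shortfalls / trials

    scenarios = {"baseline": 0.0, "pandemic surge": 0.4, "demand collapse": -0.3}
    for name, shock in scenarios.items():
        print(f"{name}: {run_scenario(1000, shock, capacity=1300):.1%} shortfall risk")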

4. Electronics Brand Aims for an Omnichannel Approach

The digital landscape in today’s business environment is always changing, because improvements and advancements continually push leaders to evolve their companies.

In one case, Currys, a market-leading brand in the United Kingdom’s tech retail market, pursued a transformation that would turn it into a digital-first omnichannel retailer. Leaders already knew that 60% of the company’s customers preferred to shop across multiple channels, so the push to suit shoppers’ wants made sense.

Executives had a three-pillar plan for making improvements:

  • Ensuring an easy shopping experience
  • Providing a connected customer journey
  • Creating capable and committed employees

Using a Customer Relationship Management (CRM) platform allowed the company to meet all those goals by providing a 360-degree view of the marketing, sales, and service interactions associated with every customer. Company decision-makers also used specialty software to update legacy infrastructure, and cloud-based tools to give people the information they needed from any location.

These improvements created digital enhancements that affected online and in-store shopping experiences alike, and they freed employees to focus on building lasting customer relationships rather than merely serving people through single purchases.

Additionally, the digital transformation gave customers more personalized experiences, providing them with relevant prices and other product information to reduce friction. Whether people want to buy a new TV or a kitchen appliance, they’ll appreciate having need-to-know information that guides their interactions.

5. Cloud Computing Helps the University of Bristol Reach More Students

One of the primary goals of a digital transformation is to unlock more opportunities for an organization. Such was the case when University of Bristol officials wanted to branch out into remote learning, which would provide educational options for more people than the approximately 28,000 students who take in-person classes each year.

University executives chose a cloud technology solution and realized that moving into the realm of online courses would bring multiple benefits. For example, it would open new commercial appeal for the higher education institution while giving learners the freedom and flexibility to attend regardless of their locations or backgrounds.

Leaders also believe this digital transformation will improve the university’s academic research arm. The University of Bristol counts international meteorological bodies, pharmaceutical researchers, and health care organizations as partners. All those parties have different requirements, but partnering with academic institutions allows researchers to take advantage of computing and storage resources that further their projects.

The university’s leaders knew their digital transformation would span multiple years. However, they did the smart thing from the start and worked with technology experts who could advise them every step of the way. These partners assisted in defining a target operating model and teaching the university’s employees the new skills they’d need to succeed with the cloud-based technology.

Much of this tech support happened remotely, yet it went smoothly, underscoring the provider’s ability to help clients digitally transform without the restrictions of geographical boundaries.

Digital Transformations Take Many Forms

These five examples prove there’s no single way to enact a digital transformation within a company. However, businesses are most likely to get the best results when they take the time to determine what would most severely limit future growth and success. Paying attention to those issues could provide decision-makers with the evidence they need that a digital transformation must happen soon. They’ll also better understand which problems to tackle first when making digital improvements.

Trends in Low-Code/No-Code https://www.datamation.com/trends/trends-in-low-code-no-code/ Mon, 08 May 2023 13:53:04 +0000 https://www.datamation.com/?p=24094 Coding is an in-demand skill that usually requires highly specialized knowledge. However, the rapid rise of Low-Code/No-Code (LC/NC) platforms has allowed people to engage in web and app development in a new way. People can drag and drop components in an interface and link them to create applications.

Here are some important trends about the LC/NC landscape and how people feel about it.

1. People Viewing Low-Code Solutions as Core Technologies

Not so long ago, many tech decision-makers saw LC/NC tools as niche products that grabbed their attention but weren’t necessarily critical for day-to-day business operations. However, that’s starting to change, particularly as many companies had to evolve due to challenges brought by the COVID-19 pandemic.

A 2022 survey from Low-Code platform provider Mendix found 94% of companies across various industries used Low-Code solutions in 2022, up from 77% in 2021. Moreover, 69% of respondents saw such offerings as crisis technologies during the pandemic but now view them as core to their business models.

Additionally, half of the respondents perceive Low-Code products as filling gaps in their IT departments, while 43% see them as able to assist with production engineering needs. Another notable takeaway was that four in 10 respondents now use Low-Code platforms for mission-critical applications. In fact, it’s estimated that 63% of app development activity will be done through Low-Code platforms.

Company leaders also find plenty of ways these platforms can help them. The Mendix survey showed 63% used such tools to address problems with logistics, the supply chain, and transportation. About 32% of retailers said Low-Code tools helped them offer curbside shopping pickups. Additionally, half of public sector respondents mentioned improved planning and management of resources and enhanced service access among the benefits.

However, such efforts sometimes come with issues not directly related to Low-Code products. For example, one-third of respondents felt frustrated by their company’s legacy systems, which is why 39% have required proof that Low-Code offerings will integrate with them. That’s a smart requirement: Low-Code products are relatively new, and assurance of successful integration with legacy systems avoids surprises.

Some people have yet to try Low-Code solutions, but this study shows adoption rates are climbing. As adoption grows, individuals who previously felt unsure will become more confident about exploring the possibilities.

2. LC/NC Providing an Option Beyond Hiring Developers

Since developers are in high demand, many company leaders must devote significant resources to hiring them. That’s often easier said than done, especially if there are relatively few developers in the job market or those within it have plenty of choices regarding where to work.

A 2023 study gave a closer look at the job market for developers and those who want to hire them. One finding was that 53% of developers consider salary the most important factor in a potential job. Another 38% of respondents mentioned having a good work-life balance, and 28% wanted the option to work remotely.

Another finding was that 52% of developers plan to leave their jobs within the next year. Among that group, 67% said the desire to get a higher salary was the main reason behind that decision. That finding suggests managers and human resources professionals cannot merely assume they’ll be able to retain developers after hiring them. These workers know they’re in demand, so they can afford to be picky about finding and staying at the most suitable workplaces.

On the recruitment side, 23% of tech recruiters said they plan to hire at least 50 developers this year. About 42% cited developer retention as their top priority for 2023, and 46% of recruitment professionals said they’d have bigger budgets this year than last. However, it’s not a given that recruiters will find and attract all the developers they need. This is where LC/NC platforms will prove particularly useful.

They won’t eliminate the need to hire developers, but an LC/NC tool could meet business needs and fill gaps while the hiring process is ongoing. Companies can become more nimble and able to respond to marketplace changes faster than they otherwise might.

3. An Appealing Option for Small Businesses

People who own or operate small businesses often face additional challenges related to resource usage and being able to pursue growth like larger companies can. However, LC/NC platforms could change that, and many analysts who have examined the matter believe they will.

This is not the first time coding has gone through a major change. Object-oriented programming arrived in the 1980s and allowed people to design programs around objects. Similarly, the 1990s and 2000s required new types of code to meet emerging needs. Companies of all sizes must adapt to stay competitive, but making those changes is often more challenging for smaller enterprises.

An Accenture report clarified why LC/NC tools are vital for helping small to medium-sized businesses (SMBs) adapt to the changing landscape and harness all their tech offerings. In the opening part of the document, the authors mention the e-commerce platform Shopify and how it was instrumental in enabling companies to keep operating once the COVID-19 pandemic closed many physical stores. They believe LC/NC tools will have an equal or bigger impact on small and medium businesses.

A statistic cited in the report mentioned that 70% of small businesses worldwide are ramping up their digitization efforts, and Low-Code/No-Code tools are instrumental in allowing that to happen. Another finding was that one in five SMBs began searching for LC/NC platforms because of difficulties finding digitally fluent workers.

Moreover, 47% of respondents believed enterprise-level IT solutions don’t meet their needs because those providing them don’t understand the associated challenges. They said the shift toward LC/NC among SMBs illustrates that issue. When offerings meant for larger companies fail to fill gaps, people will look elsewhere for alternative solutions.

4. Turning Shadow IT Into an Asset With LC/NC

Shadow IT occurs when people use IT products without explicit workplace approval. Many tech professionals view it as a major problem. Consider how a 2022 study revealed that 69% of tech executives view shadow IT as a primary concern related to adopting cloud or Software-as-a-Service tools. Another 52% of respondents said individual employees purchase apps for use at work without the IT department’s knowledge.

However, some advocate turning shadow IT into an asset with LC/NC apps. A McKinsey report explained how Low-Code/No-Code products could allow organizations to increase innovation and speed when business and IT teams collaborate.

The report detailed three specific ways to achieve those aims:

  • Using enterprise-grade LC/NC platforms to customize and expand a product’s out-of-the-box capabilities
  • Augmenting existing products to give them new features and capabilities
  • Prototyping new ideas to establish use cases for new business applications

Many people use shadow IT products because the approved offerings don’t meet all their needs. These individuals are frequently unaware that they’re breaking any rules at their organizations and are merely trying to keep their workflows productive.

IT decision-makers should get feedback about any shortcomings associated with the approved products for employees to use. They could use LC/NC platforms to address those weak points with new products or options that alter what existing tools can do. After all, Low-Code and No-Code products typically shorten the overall development cycle, making workforces better equipped faster than traditional methods allow.

5. Company Representatives Discussing LC/NC More Often

Low-Code and No-Code technologies have a much better chance of succeeding when business leaders understand the advantages and are open to using them. A 2023 report showed a modest but notable rise in LC/NC discussions at companies.

The data indicated an 18% year-on-year increase in such talks in 2022 versus 2021. That’s important because discussions are often critical to help people with corporate buying power determine if they want to pursue certain possibilities.

Numerous software company representatives have pondered using LC/NC tools to reach internal aims faster than they could with conventional coding. Most business leaders know the importance of watching for changes in their respective industries and responding promptly. Otherwise, people who wait too long to react could face challenges catching up with their peers.

It also helps when companies have specific individuals who champion LC/NC products. Humans naturally resist change, even when they can see some of the advantages. However, when someone they know, trust, and respect encourages them to be open to Low-Code and No-Code products, it becomes more likely they’ll eventually embrace them.

These Low-Code/No-Code Trends Matter

Low-Code and No-Code platforms are still evolving, along with people’s opinions of them. Seeing how things play out in the coming months and years will be interesting. In any case, the five trends here are important to watch because they highlight the current state of things. Even if things change later, these ongoing patterns in LC/NC adoption and usage will likely shape what’s ahead.

The Future of Low Code No Code https://www.datamation.com/trends/the-future-of-low-code-no-code/ Fri, 05 May 2023 20:30:25 +0000 https://www.datamation.com/?p=24091 Low-Code/No-Code (LC/NC) platforms are revolutionizing the software development industry. Today, anyone can use them to create their own app, tool, or website without existing programming knowledge. How will Low-Code/No-Code platforms evolve in the coming years, and how are they forcing the industry itself to evolve?

Evolving Applications of Low-Code/No-Code

The LC/NC market is expected to grow 20% in 2023 alone and reach an estimated value of $26.9 billion. This technology has gained popularity in recent years as a means of closing skill gaps and making app and web development more efficient. However, it still lacks the flexibility of custom apps designed more traditionally by skilled developers.

Current applications for LC/NC development sit somewhere between off-the-shelf and custom solutions. How will these applications change in the next few years? Here are some of the areas in which developers can expect to see change.

Robotic Process Automation (RPA)

Robotic Process Automation is one of today’s most common applications for Low-Code/No-Code platforms. LC/NC is a great fit for RPA because it usually requires simplifying something that already exists, such as automating a specific workflow.

Low-Code/No-Code developers already know what they need from an app they want to build, so they can shortcut the process without significant User Experience (UX) design work. LC/NC approaches give new developers the tools to build and integrate a straightforward RPA app in the minimum possible turnaround time.

In the future, LC/NC platforms may include more advanced RPA capabilities, integrating data from more sources or handling more tasks in a single app. This particular use case may lean more toward No-Code platforms, since automation will soon be necessary in more jobs. As more people without coding experience seek to use automation, demand for RPA-specific No-Code platforms will increase.
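The kind of workflow such a platform automates often reduces to a few scripted steps like the sketch below: watch a folder, extract totals, and file documents away. The folder names and CSV layout are hypothetical, and real LC/NC tools generate this logic from drag-and-drop blocks rather than hand-written code.

    # Minimal RPA-style sketch; the drag-and-drop blocks of an LC/NC tool
    # often compile down to steps like these. Paths and layout are invented.
    import csv
    from pathlib import Path

    INBOX, DONE = Path("invoices/inbox"), Path("invoices/processed")

    def process_invoices():
        """Total each CSV invoice in the inbox and move it to processed."""
        DONE.mkdir(parents=True, exist_ok=True)
        for invoice in INBOX.glob("*.csv"):
            with invoice.open(newline="") as f:
                total = sum(float(row["amount"]) for row in csv.DictReader(f))
            print(f"{invoice.name}: ${total:,.2f}")
            invoice.rename(DONE / invoice.name)

    if __name__ == "__main__":
        process_invoices()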

Simple Web and App Development

The main apps and tools for which Low-Code/No-Code approaches are currently ideal are typically simple in scope and limited in distribution. Most often, a user develops an app solely for in-house use, for their own personal use, or for a one-time event or conference.

For example, Low-Code/No-Code is commonly used for replacing legacy systems. Digital transformation spending is expected to total $3.4 trillion worldwide by 2026. Businesses must evolve their operations and technology to keep up, but that can be difficult without a large development team. Low-Code/No-Code platforms allow companies to upgrade technologies and workflows without in-house developers.

Low-Code/No-Code development platforms aren’t intended for large-scale applications, nor are they ideal for supporting hundreds of users or managing massive quantities of data. In the future, this could change as the technology becomes more capable. For example, Artificial Intelligence (AI) could make it easier to create complex apps without requiring coding knowledge.

Challenges and Innovations in Low-Code/No-Code

How will the capabilities of Low-Code/No-Code platforms evolve, and what new applications are emerging? These platforms will increasingly shift toward requiring zero IT involvement in the development process as AI makes it possible for nearly anyone to create original, customized code.

Generative AI-Powered Coding

Generative AI is changing the game in app and web development. Platforms like ChatGPT are opening the door for anyone to try developing their own app or website with zero prior experience. Users can type in a text prompt explaining what they want, and ChatGPT will do its best to generate code that fits the bill. It can also help debug code that users copy and paste into the prompt window.

Of course, platforms like ChatGPT are not foolproof. They make mistakes, and users have found flaws and gaps in AI-generated code. As of 2023, GPT-4 excels at small, specific chunks of code but breaks down when asked to write an entire application. It can deliver customized code, but only piecemeal, so developers still need to know what’s required and how each piece fits with the rest of their apps.
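Requesting one of those small, specific chunks programmatically looks roughly like the sketch below, which uses the OpenAI Python library’s chat API as it existed in 2023 (pre-1.0); the model choice, prompt, and placeholder key are illustrative.

    # Sketch of requesting a small chunk of code from a model via the
    # OpenAI Python library (pre-1.0 chat API, current as of 2023).
    import openai

    openai.api_key = "YOUR_API_KEY"  # placeholder

    response = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are a careful Python programmer."},
            {"role": "user", "content": "Write a function that validates an "
                                        "ISO 8601 date string (YYYY-MM-DD)."},
        ],
    )
    print(response.choices[0].message.content)  # output still needs human review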

Platforms like ChatGPT could evolve into full-scale app development tools in the future. In many ways, AI is the ultimate Low-Code/No-Code platform. Users type in what they want the code to do and let the AI do the rest. Businesses will likely be able to function with small teams of developers who verify and implement the generated code.

Greater Emphasis on Cybersecurity

One of the pitfalls of today’s Low-Code/No-Code platforms is the minimal ability to customize security features. The lack of visibility into the code running behind the scenes simplifies development but blinds developers to potential security risks. Additionally, people using LC/NC approaches with no coding knowledge or experience may not be aware of important security features they should have or red flags to watch for.

In the future, Low-Code/No-Code platforms will see more emphasis on cybersecurity. For example, the Open Worldwide Application Security Project (OWASP) has published a Top 10 list of key security risks for Low-Code/No-Code apps. Developers can use it to learn about important security risks and how to address them in their development process.

The security options in Low-Code/No-Code platforms themselves will also grow in the years ahead. The global cost of cybercrime is expected to hit $11.5 trillion in 2023 and more than double that by 2027. There will be more demand for advanced security features as security threats grow. For example, developers might begin including AI threat-monitoring tools.

Clearer Intellectual Property Standards

Intellectual Property rights are a growing concern in coding and development, especially since AI can write functional code. When anyone can automate coding, who is really writing it? Who is the developer of new Low-Code/No-Code apps, and who has the IP rights to these programs and any profits made?

These questions must be resolved as Low-Code/No-Code platforms gain in popularity, particularly in the context of growing geopolitical complications surrounding IP rights. For instance, the war in Ukraine led Russia to implement a 0% license fee on IP content from “unfriendly countries” like the U.S. and European nations.

Code and apps can be subject to IP laws, not just content such as books and movies. Low-Code/No-Code platforms may soon be able to develop apps on the same level of customization and precision a professional developer could deliver, and the industry will need to decide who has the IP rights to these new apps—the people using the platforms, or those who designed them.

How Will Low-Code/No-Code Impact Developers?

Low-Code/No-Code technology’s role in the software development industry is also evolving, and everyone is wondering what the future holds for professional software developers. The combination of AI and Low-Code/No-Code platforms leads many to wonder if developers will become obsolete. While that will not happen anytime soon, the developer role is shifting.

Low-Code/No-Code platforms and AI like ChatGPT are tools, like any other technology. They can help developers do their jobs more efficiently and easily but cannot replace the expertise people can provide.

Resolving the skills shortage is one specific area where Low-Code/No-Code platforms will help developers. Coders and programmers are in high demand in all areas of computer science today.

For example, the shortage of cybersecurity professionals leaves many businesses ill-equipped to handle rising cybercrime rates. Similarly, over 37% of recruiters report struggling to find enough developers with the necessary skills for their businesses’ needs. However, young people continue to show a strong interest in computer science, indicating a growing talent pool.

Demand for software development skills continues to grow faster than the available talent pool can keep up with. Low-Code/No-Code platforms will help businesses fill those shortages. Smaller teams of developers can use them to work more efficiently and operate at the same level as a larger group.

Similarly, developers may not need to do much manual coding in the future. Their roles may shift toward designing, testing, and maintaining apps. Meanwhile, Low-Code/No-Code platforms and AI will do the bulk of the actual code-writing process. As a result, developers will be able to roll out apps faster and with less budget required.

Low-Code/No-Code Is Innovating Software Development

Low-Code/No-Code software development platforms are transforming how new apps, tools, and websites are created. Now anyone can get into software development, regardless of prior coding experience.

Low-Code/No-Code platforms will become more capable in the years ahead thanks to the advanced capabilities of AI models like ChatGPT. IP rights and cybersecurity will become important concerns as adoption grows. Professional developers will remain vital to the industry for the foreseeable future, although their roles will evolve to adapt to Low-Code/No-Code processes.
