Security Archives | Datamation
https://www.datamation.com/security/

6 Top Data Classification Trends for 2023
https://www.datamation.com/security/data-classification-trends/ | Fri, 13 Oct 2023

Data classification—organizing data by relevant categories—should be a key part of an enterprise’s larger data management strategy. Tagging data can make it more searchable, and therefore more useful. It can also eliminate duplicate data, which reduces storage needs and expenses and improves the speed of queries and analytics. Misclassified data, by contrast, produces inaccurate results and can lead to security incidents when incorrectly labeled data is mistakenly made public.

Historically, organizations were often lax about data classification, creating problems that compounded quickly and led to data sprawl, lost productivity, and security concerns. But as data becomes increasingly essential for business—and accumulates in massive volumes—organizations have begun to consider data classification a pillar of their data management efforts. Here are the six top data classification trends for 2023.

1. AI is Driving Data Classification Efforts

Artificial intelligence (AI) had a banner year in 2023, and data science—like most industries—has begun to reap the benefits. Legacy data classification systems required challenging implementations and lacked the ability to perform context-based classification, but new solutions use AI to incorporate content awareness and context analysis into classifying and sorting data.

AI-powered automation in data classification can help companies analyze and label unstructured data at unprecedented scale with minimal human intervention. This allows organizations to classify more data more quickly and to work around the industry-wide shortage of qualified data staff.

AI also provides data leaders with actionable visibility into how data is used, shared, and acted on by different users, making it easy to flag suspicious data.
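
To make the idea of automated labeling concrete, here is a minimal sketch in Python using scikit-learn; the training snippets, labels, and model choice are illustrative assumptions, not a description of any specific vendor’s classifier.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical pre-labeled examples: text snippets mapped to sensitivity labels.
train_docs = [
    "Quarterly revenue forecast and board meeting minutes",
    "Customer SSN and bank account details for payment processing",
    "Lunch menu for the office cafeteria",
]
train_labels = ["internal", "restricted", "public"]

# TF-IDF features plus logistic regression: a simple, auditable baseline.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(train_docs, train_labels)

# New, unlabeled documents are tagged automatically and can then be routed
# to the appropriate storage tier or access policy.
new_docs = ["Patient billing records with account numbers"]
print(classifier.predict(new_docs))  # e.g. ['restricted']
```

In practice the training set would come from documents already labeled by data stewards, and the predictions would feed the tagging workflow rather than being trusted blindly.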

2. More Data Regulations are Being Implemented and Enforced

As more and more data breaches come to light, especially in critical infrastructure, governments have begun to tighten their grip around tech companies that violate data management and localization principles. New data privacy laws abandon the harm-based approach—preventing and punishing misuse of consumer data—in favor of a rights-based approach that gives individuals control over how their data is managed, used, and processed.

The European Union is currently undertaking its largest cross-border investigation under the General Data Protection Regulation (GDPR) and taking action against member states that allow data attacks to thrive. While the U.S. has historically had a more lenient approach toward how organizations collect and classify data, that might be changing—after passage of the watershed California Consumer Privacy Act (CCPA), other states including Colorado, Utah, and Virginia have pursued similar legislation.

Additional policies like the National Cybersecurity Strategy, the Gramm-Leach-Bliley Act (GLBA), and the Family Educational Rights and Privacy Act (FERPA) put multiple federal regulators in the U.S. in charge of overseeing the implementation of data governance policies and assisting with the classification, usage, and archival of data across the entire data lifecycle.

3. Better Technologies are Making Data Classification More Effective

Technology is fueling a new wave of data democratization, providing simpler access controls, more secure delivery, and greater decentralization. At the forefront is the integration of data fabric—which stitches together metadata to aid data classification—and data mesh, which can reduce information silos and aid in governance by putting the onus on teams that produce data.

The combination of technologies helps companies process data from multiple sources, producing faster insights and creating a frictionless web for all stakeholders to engage with processed data. It also helps build an autonomous, company-wide data classification and coverage interface that provides self-service access to fragmented datasets.

Enterprises can significantly reduce operational expenses by classifying data in place—without having to move it—and creating a data abstraction layer. They can also manage their security postures with improved data access and intelligent query escalation, allowing them to build a top-down data service.

4. Zero-Trust Data Privacy Vaults are Being Used for Sensitive Data

Data classification plans must also secure confidential and restricted data by de-identifying critical datasets and exposing only the information needed to complete a task. As tech firms face greater compliance demands from regulators, privacy vaults are increasingly drawing attention as an interesting solution. A zero-trust vault eases personally identifiable information (PII) compliance concerns by providing a controlled environment to protect sensitive data.

Most privacy vaults use polymorphic encryption, two-factor authentication, and regular data audits to detect vulnerabilities and harden customer data against attack. They also allow governments and businesses to work together on privacy by design in big tech by redacting confidential datasets, tokenizing sensitive information, and restricting the flow of personal data into large language models (LLMs) like ChatGPT.

Privacy vaults are especially popular in the pharmaceutical field, where proprietary research has to be protected across the drug lifecycle.
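
As a rough illustration of the tokenization idea mentioned above, the sketch below swaps PII values for opaque tokens and keeps the token-to-value mapping in a stand-in vault; the field names and in-memory dictionary are hypothetical, and a real vault would add encryption, access controls, and audit logging.

```python
import secrets

# Stand-in for the vault: in production this would be an encrypted,
# access-controlled service, not an in-memory dictionary.
vault = {}

def tokenize(value: str) -> str:
    """Swap a sensitive value for a random token and store the mapping in the vault."""
    token = "tok_" + secrets.token_hex(8)
    vault[token] = value
    return token

def detokenize(token: str) -> str:
    """Resolve a token back to the original value (only for authorized callers)."""
    return vault[token]

record = {"name": "Jane Doe", "ssn": "123-45-6789", "order_total": 42.50}

# Downstream systems (analytics, LLM prompts, logs) only ever see tokens.
safe_record = {
    "name": tokenize(record["name"]),
    "ssn": tokenize(record["ssn"]),
    "order_total": record["order_total"],  # non-sensitive fields pass through
}
print(safe_record)
```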

5. Unstructured Data is Powering Business Intelligence

Unstructured data—emails, text messages, and multimedia, for example—poses particular challenges for data classification. Like the dark matter of the universe, it is difficult to detect and hard to analyze, yet it accounts for a significant portion of the data enterprises collect and use.

The growing focus on unstructured data is driven by the time crunch that businesses face in a fiercely competitive market. They have to feed data pipelines faster, move only the data they need—and that has already been classified—and eliminate manual efforts to find classified datasets.

Finding ways to process and classify unstructured data can provide improved storage capacity, a data-driven way to measure consumer experience, and a better understanding of user sentiment.

Read our Comprehensive Guide to Data Pipeline Design.

6. Companies are Assessing Risks to Prevent Shadow Access

Shadow access—unintended, uninvited, and unnoticed access to datasets—is an increasingly exploited risk facing businesses with large volumes of poorly classified data. That risk is only expected to grow as more data gets stored and shared in the cloud.

About 80 percent of all data breaches involve existing credentials—employees intentionally or inadvertently sharing confidential information or accessing unauthorized applications and cloud services. With blurred lines between personal and professional domains and the growing complexity of cloud identity, shadow access has become an even thornier issue.

Because you can’t protect what you don’t know, new tools to assess risk for shadow access are garnering attention from data leaders. They allow them to identify data types that are vulnerable to security risks and take necessary steps to mitigate those risks.
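
A drastically simplified version of such an assessment might scan an exported inventory of access grants for entries that are unscoped or have gone unused; the record format, names, and staleness threshold below are hypothetical.

```python
from datetime import date

# Hypothetical export of access grants: (principal, dataset, permission, last_used).
grants = [
    ("analytics-svc", "customer_pii", "read", date(2023, 9, 30)),
    ("former-contractor", "customer_pii", "read", date(2022, 11, 2)),
    ("everyone", "finance_reports", "read", None),  # public, unscoped grant
]

STALE_DAYS = 90

def flag_shadow_access(grants, today=date(2023, 10, 13)):
    findings = []
    for principal, dataset, permission, last_used in grants:
        if principal == "everyone":
            findings.append((dataset, "unscoped grant"))
        elif last_used is None or (today - last_used).days > STALE_DAYS:
            findings.append((dataset, f"stale grant for {principal}"))
    return findings

for dataset, issue in flag_shadow_access(grants):
    print(dataset, "->", issue)
```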

Bottom Line: Enterprise Data Classification is Evolving

As enterprises race toward the creation of data-safe environments, their data classification policies will increasingly become one of the differentiating factors. At the moment, the field of data classification is in flux, driven by the advent of generative AI, a greater demand for customer experience, and growing pains of data sprawl. But organizations that tap into these innovations to shore up their data classification efforts and their larger data management strategies will ride the wave to a more successful, more secure, and more actionable data future.

Read The Future of Data Management to see other trends in how enterprises work with and keep tabs on mission critical information.

What is SOX Compliance? Requirements & Rules
https://www.datamation.com/big-data/sox-compliance/ | Wed, 04 Oct 2023

The Sarbanes-Oxley (SOX) Act is a milestone data compliance and disclosure law designed to protect investors by improving the accuracy and reliability of corporate disclosures and making corporate board members, managers, and accounting firms liable for the accuracy of their financial statements. IT plays a significant role in corporate compliance with the regulatory policies established by SOX, since the related financial reports come from data housed on corporate systems, which must be secured and maintained in a safe environment. This article explores the key contents of SOX, how companies can stay in compliance, and the benefits of regulatory enforcement.

What is SOX Compliance?

The SOX Act protections require companies to maintain a thorough, accurate knowledge of their financial data and upkeep their network security in all areas where financial data could be breached or misrepresented. Passed by the U.S. Congress in 2002 after several major fraud cases, including the Enron fraud scandal, SOX guards investors against faulty or misrepresented disclosures of publicly traded companies’ financial data.

At a high level, SOX mandates that companies do the following:

  • Prepare complete financial reports to ensure the integrity of financial reporting and regulatory compliance
  • Put controls in place to safeguard financial data and ensure its accuracy
  • Provide year-end financial disclosure reports
  • Protect employee whistleblowers who disclose fraud

SOX also requires CEOs, CFOs, and other C-suite executives to take responsibility for honest financial data reporting, formalized data security policies and procedures, and documentation of all relevant financial details—which can all be pulled up and reviewed via audit at any time. But SOX also puts pressure on IT teams, much like other government, regulatory agency, and jurisdictional compliance policies like the European Union’s General Data Protection Regulation (GDPR), through its data and reporting requirements.

Data-Specific Rules in SOX

SOX specifically regulates the financial data of publicly traded companies, especially as it relates to corporate transactions, which can include line items like off-balance sheet transactions, pro forma figures, and stock transactions. The law enacts several rules for these kinds of financial data, obliging companies to submit for regular external audits and enabling internal reporting and controls to support financial data accuracy.

Data management and archiving are essential to SOX. IT must create and maintain a data archive of corporate records that conforms to the management of electronic records provisions of SOX Section 802, which provide direction in three critical areas:

  • Retention periods for records storage are defined, as are SOX best practices for the secure storage of all business records
  • Definitions must be made for the various types of business records that need to be stored (e.g., business records, communications, electronic communications)
  • Guidelines must be in place for the destruction, alteration, or falsification of records and the resulting penalties

Beyond routine audits and maintenance of financial reporting, companies are expected to report concrete evidence of changes in financial condition to the SEC. The controls that SOX requires include an Internal Control Report, which details all financial history for managerial responsibility and transparency, as well as additional documentation that proves the regular monitoring of financial data.

The SEC also requires formal data security policies with proof of communication and enforcement across a corporate network. SOX does not provide exact security protocols or expectations.

SOX Compliance

SOX compliance falls into the category of corporate governance and accountability. While it’s mainly financial, it also involves enterprise IT departments as it includes very specific guidelines for how corporate electronic records must be stored and for how long—generally, for a minimum period of five years.

SOX directs all covered companies to undergo annual audits and make the results publicly available to their stakeholders. In order to pass a SOX compliance audit, companies need to inspect the quality of their internal controls and systems in these five key areas:

  • Access controls that limit physical and electronic access to only what authorized users absolutely need
  • Security measures—endpoint security, multi-factor authentication, and anti-malware, for example—set up and maintained to protect against breaches
  • Secure backup storage for all relevant financial data that could be affected by a breach
  • Change management and internal auditing practices to ensure financial data remains protected when users, devices, and programs change
  • Appropriate reporting cycles, report formats, and data content, with a documented review process for SOX reports

SOX Enforcement

Externally, SOX is enforced by the U.S. Securities and Exchange Commission, which established the Public Company Accounting Oversight Board to oversee, regulate, and discipline auditors who work with publicly traded companies under SOX.

All publicly traded companies with American shareholders are required to comply with SOX rules, including related boards, management, and accounting firms. The consequences for non-compliance can include fines, imprisonment, or both.

SOX can also be applied to international companies in certain situations—like other data laws, such as GDPR, SOX applies to any publicly traded company that does business with American citizens, even if the business itself is not located in the United States.

SOX Benefits

SOX has ushered in a level of financial accountability and liability that makes it difficult for publicly traded companies to defraud or mismanage financials. It has improved corporate data governance and ethics and made financial responsibility both a management and a board-level mandate. SOX also delivers a number of additional benefits for IT.

More Widespread Acceptance

Traditionally, management has not always recognized the return on investment of IT projects, but SOX has changed that to some extent. For example, it may be easier to approve the purchase of data integration and cleaning software, additional data storage, or expensive security and activity monitoring software if it’s necessary to help the company stay SOX compliant. Similarly, IT policies that might have been viewed as unnecessary or ignored because they might delay project deliverables now must be documented for compliance.

Reduced Complexity

SOX forces the integration of systems, work processes, and data that might not otherwise be integrated. Many companies use multiple invoicing, purchasing, enterprise resource planning (ERP), and customer relationship management (CRM) systems that IT needs to support. To maintain compliance with SOX, those systems are more likely to be integrated and business processes and systems redesigned to make everything—including data—more seamless and uniform. This integration reduces system complexity for IT in both new application development and system maintenance.

Supplier Data Sharing

SOX can improve the quality of transactions and data sharing with suppliers. While IT has traditionally struggled to integrate internal systems with those of suppliers for data exchange, SOX elevates the issue of supplier data incompatibilities into a compliance case for uniform data standards. This can compel supplier audits and demands for change to better integrate supplier data with corporate systems.

Improved Data Quality

The need to conform to external regulatory requirements has placed the spotlight on clean, accurate data and reporting and highlighted the importance of high-quality data—even if it means investing IT staff time and budget. Standardized, high-quality data is now the goal of virtually every company; without it, it’s almost impossible to run analytic and automation technologies like artificial intelligence. SOX and other compliance regulations help facilitate this work.

SOX Challenges

Despite the benefits of compliance—not least of which is avoiding punishment and fines—companies face challenges in their ongoing efforts to meet SOX regulations, which can put burdens on multiple departments and teams. Here are some of the most common.

Lack of Expertise

Inadequate resources or internal SOX expertise can be a problem for many companies, especially new and/or smaller businesses. Compliance requires implementing appropriate controls to monitor each SOX-related process—for example, purchasing might implement a control so that only someone manager-level or higher can sign off on an order in excess of $1,000. If the existing purchasing system does not have that checkpoint built into it, unsigned invoices could slip through and create a material weakness for auditors or regulators to find.
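
As a rough sketch of the kind of checkpoint described above, the logic below rejects purchase orders over a threshold unless a sufficiently senior approver has signed off; the roles, threshold, and function names are illustrative only.

```python
APPROVAL_THRESHOLD = 1_000  # orders above this amount need manager sign-off

ROLE_RANK = {"analyst": 1, "manager": 2, "director": 3}

def validate_purchase_order(amount: float, approver_role: str | None) -> bool:
    """Return True if the order satisfies the sign-off control."""
    if amount <= APPROVAL_THRESHOLD:
        return True
    if approver_role is None:
        return False  # missing sign-off: a material weakness an auditor would flag
    return ROLE_RANK.get(approver_role, 0) >= ROLE_RANK["manager"]

print(validate_purchase_order(750, None))          # True: under threshold
print(validate_purchase_order(5_000, None))        # False: no approver
print(validate_purchase_order(5_000, "manager"))   # True: manager signed off
```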

Company Culture

Some company cultures are averse to having rules and regulations foisted upon them. For example, some technology startups pride themselves on creativity, freedom, and innovation—such environments make it difficult to get management onboard with costly, time-consuming, and restrictive SOX initiatives.

Data Integration 

Just because SOX requires data integration and uniform data management doesn’t make the job of data integration any easier for IT—it will take time, money, and resources. Businesses that merge or go through acquisitions and subsequently have to blend disparate systems into a composite enterprise whole for SOX reporting, especially, may find the effort daunting.

Regulatory Changes

The regulatory environment is constantly changing, and companies need to keep up. When a SOX requirement changes, the typical chain of communication starts in a regulatory agency, trickles down to the legal staff, gets reviewed by management, and then finally makes its way to IT. The challenge comes in keeping delays from happening along the way so that IT has time to implement the changes before the deadline.

The Bottom Line: SOX Compliance and Enterprise IT

SOX compliance is a fact of life for publicly traded companies. IT plays a major role in assuring that SOX guidelines and requirements are met. While the burden is high—and so are the costs for not meeting it—the advantages of compliance are widespread and benefit the companies themselves, not just their investors. SOX compliance has also elevated the role of IT in enterprise businesses, giving it a seat at the table it did not necessarily have prior. As similar new data regulations start to take hold around the world, IT teams will continue to play an important role in helping businesses stay compliant.

Read about the future of data management to learn about how other trends and policies are shaping the way enterprise organizations work with data.

Data Governance vs. Master Data Management: Key Differences
https://www.datamation.com/big-data/data-governance-vs-data-management/ | Mon, 25 Sep 2023

Data is an invaluable resource for decision-making and strategic planning across industries, but its sheer volume can strain resources and infrastructure if it’s not stored and managed properly. Enterprises often employ data governance and master data management strategies as part of a larger data management effort to meet that goal.

Data governance is a holistic approach to your data—at its core, it refers to the set of rules and permissions by which the data is orchestrated and accessed. Master data management (MDM) is the overlap of technology and business operations to enforce the policies of data governance, ensuring the uniformity, accuracy, and accessibility of data.

This article explores the differences between data governance and master data management, where they overlap, and how they can be used together to help businesses ensure their data is accessible, reliable, and secure.


The Challenge of Managing and Governing Data

The challenges of managing and governing data go beyond the cost of storing the massive volumes that businesses have come to rely upon for decision-making, operations, and competitive advantage. They also include security—through policies like the European Union’s General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and the Health Insurance Portability and Accountability Act (HIPAA), the regulatory landscape has redefined the baseline for adequate data protection and management practices.

Maintaining data quality is another concern. The accuracy and reliability of data are critical—if either suffers, the decisions that data informs aren’t grounded in reality.

The intertwined concepts of data governance and master data management serve distinctive roles. Differentiating between them is essential to mastering your data landscape.

Learn the 10 Best Practices for Effective Data Management.

What is Data Governance?

Data governance is a holistic approach to the how, why, and by whom of the data in the care of your organization. At its core, data governance refers to the set of rules and permissions that determine how the data is handled and who gets to access it. These rules have an effect on the data’s availability, usability, integrity, and security.

Not just a technical endeavor, it’s also directly related to business processes and organizational culture. Think of data governance as a rulebook, dictating processes and policies to ensure the data is strictly accessed by authorized individuals and isn’t used in any way that could harm the organization, its stakeholders, or clients.

The Core Principles of Data Governance

While every organization’s data governance policies will depend upon its needs and goals, here are the core principles they should follow:

  • Data Quality—Ensures the data is maintained to be as accurate, reliable, and actionable as possible.
  • Data Access—Makes the data readily available to all who need access to it, creating a streamlined process for access requests and approval procedures.
  • Data Privacy—Protects the data and personal information of clients, employees, and stakeholders.
  • Data Protection—Maintains the security of the data from potential threats and malicious actors using security measures to protect data assets.
  • Regulatory Compliance—Ensures that the data handling methods meet legal or jurisdictional requirements.

Benefits of Data Governance

In addition to helping organizations improve the processes around the core principles listed above by improving quality, enhancing security, and ensuring regulatory compliance, data governance also helps in other areas of the business.

For example, more accurate and reliable data better informs decision-making at all levels, leading to better outcomes. It also increases the agility of the business—in fluctuating market conditions, organizations are able to pivot more readily when they have a solid grip on their data.

Data governance isn’t purely an IT strategy. It’s an organizational imperative that ensures data, in all its forms, is treated with the importance and security it requires.

What Is Master Data Management (MDM)?

Master data management (MDM) is the overlap of technology and business operations to enforce the policies of data governance, ensuring the uniformity, accuracy, and accessibility of the enterprise’s data. By consolidating the data, MDM ensures it’s consistently synchronized across the various systems, bridging data islands to create a seamless flow of up-to-date data and information.

MDM combines software and other tools, processes, and the mechanisms outlined by an organization’s data governance policies to ensure that master data is governed and coordinated across the enterprise with high accuracy and integrity.

Components of MDM

MDM isn’t a single entity, but a combination of multiple technical and operational components. Each of the following plays a role in the data management process:

  • Data Integration—Integrates incoming data from legacy systems, CRMs, ERPs, and external data feeds into a central repository to offer a comprehensive view.
  • Data Quality—Ensures that the consolidated data maintains its quality by removing duplicates, rectifying errors, and filling in gaps to ensure a complete set (see the merge sketch after this list).
  • Data Governance—Controls who can access the data, how it’s categorized, and how often it gets updated.
  • Business Process Management—Updates data seamlessly as processes change (for example, during a product launch or merger).
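
To make the data integration and data quality components more concrete, here is a minimal merge sketch in Python: customer records from two hypothetical source systems are matched on email and combined into a single master record, with the newest non-empty value winning for each field. The systems, fields, and survivorship rule are simplifying assumptions, not a description of any particular MDM product.

```python
# Two hypothetical source systems holding overlapping customer data.
crm_records = [
    {"email": "jane@example.com", "name": "Jane Doe", "phone": "", "updated": "2023-09-01"},
]
erp_records = [
    {"email": "jane@example.com", "name": "J. Doe", "phone": "555-0100", "updated": "2023-06-15"},
]

def merge_customer(records):
    """Build one master record: the newest non-empty value wins for each field."""
    master = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if value:  # later (newer) non-empty values overwrite earlier ones
                master[field] = value
    return master

# Group by a matching key (email) and merge duplicates into master records.
by_email = {}
for rec in crm_records + erp_records:
    by_email.setdefault(rec["email"], []).append(rec)

master_records = [merge_customer(recs) for recs in by_email.values()]
print(master_records)
```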

Benefits of Master Data Management

Master data management helps organizations by creating a single, reliable repository of information. This means higher quality data that’s devoid of redundancies and inaccuracies. It also streamlines business processes by consolidating the data and providing a unified view to remove roadblocks or discrepancies and allow for smoother operations in the supply chain, finance, and consumer relations management.

With reliable access to data-backed decisions, business leaders can craft strategies that are grounded in reality and aligned with goals and aspirations. MDM can also increase operational agility by ensuring businesses are highly adaptable, using real-time data access to shift strategies and responses as needed.

Master data management is not just an IT tool, it’s a strategic imperative. As organizations grapple with growing data complexities, MDM stands out as the beacon, ensuring data coherence, quality, and agility.

Data Governance vs. Master Data Management: Key Differences

While largely interconnected, master data management and data governance serve different facets of the data ecosystem. Here’s a look at what sets them apart.

Purpose and Objectives

Data governance focuses primarily on policies ensuring data integrity, security, and overall quality, treating data as one of the organization’s most valuable assets. MDM, on the other hand, zeroes in on the data itself—consolidating fragmented records and managing them precisely so that comprehensive, reliable access to the data is available around the clock.

Processes Involved

The policies created by data governance are enforced organization-wide, often taking into account regulatory compliance. MDM is more process-centric, delving into data integration, quality assurance, and aligning data with business operations.

Stakeholders and Responsibilities

Data governance includes data owners, stewards, users, and stakeholders. It takes their input, both directly and indirectly, on how the data is going to be kept accurate and safe from privacy violations. MDM, on the other hand, draws primarily on input from IT and business operations teams, which integrate data sets into necessary processes, uphold data standards, and refine business processes.

Role of Technology in Data Governance and MDM

Technology stands at the forefront of efficient data governance and master data management, and is often the catalyst for their effectiveness.

Organizations have access to specialized tools and platforms, allowing them to amass large amounts of data from varied sources while maintaining its integrity and long-term usefulness. But beyond the collection and storage of data, businesses need technological solutions to help them process and dissect data and extract actionable insights in real-time.

Combined with the advancements of processing algorithms and machine learning, the automation of repetitive tasks within data governance and MDM has become more affordable, making it more accessible to smaller organizations with limited resources.

The use of technology in data governance and management doesn’t only streamline operations—it also significantly reduces the rate of human error. On the security front, technological solutions allow for robust encryption of the data, alongside a myriad of protective and mitigative measures. This ensures the data remains shielded from potential breaches and leaks. It’s further aided by monitoring and reporting software that keeps an eye on the data both at rest and in transit at all times.

Can Data Governance and Master Data Management Work Together?

The fusion of data governance and MDM can lead to maximized outcomes. This is most evident when considering the overlap between the need to set and enforce solid policies while managing the data for processing and accessibility. Data governance and MDM are not only complementary—they thrive when implemented in unison.

Combining both concepts allows organizations to come up with frameworks that emphasize best practices while championing clear roles that uphold data quality and integrity. Through such integration, organizations can foster a seamless and efficient data management strategy that leverages the strengths of both disciplines.

Bottom Line: Data Governance vs. Master Data Management

Data governance is about managing data as an asset, while MDM is about creating a single, unified view of data. Despite the unique approaches of data governance and master data management and their myriad differences, they’re closely related and perform better when combined.

By understanding the key differences and similarities between these two disciplines, you can make more informed decisions about your data strategy and achieve better outcomes that allow you to effectively manage and govern your data, refining raw material into valuable insights that drive business growth and innovation.

Read about the 10 Best Master Data Management Tools for 2023 to learn how enterprises are implementing their own MDM strategies.

National Cybersecurity Strategy: What Businesses Need to Know
https://www.datamation.com/security/national-cybersecurity-strategy-what-businesses-need-to-know/ | Mon, 25 Sep 2023

The National Cybersecurity Strategy (NCS) is a U.S. government plan to create a safe and secure digital ecosystem by protecting critical infrastructure—hospitals and clean energy facilities, for example—from cyberthreats, increasing public/private partnerships and collaboration with international coalitions, and bolstering technology governance. The goal is to ensure that digital infrastructure is easier to defend than attack while making it safe and accessible for all Americans. A key part of the NCS is shoring up privacy efforts by increasing accountability for tech companies and other enterprises that handle people’s data. This guide highlights what businesses need to know about the plan.

Why does the U.S. Need a National Cybersecurity Strategy?

The risk posed by cybersecurity threats is enormous, and the ramifications of targeted attacks are larger still. At the individual level, data breaches can cause identity theft and loss of income; at the corporate level they can disrupt business continuity, damage reputations, and steal intellectual property; and at the government level, they can cripple agencies, shut down power grids, and cut off communications networks. 

The National Cybersecurity Strategy is a government effort to expand public/private partnerships, shore up cybersecurity defenses and alliances, and protect networks, systems, functions, and data while continuing to promote tech innovation. Some of the goals of the plan include the following:

  • Simplifying threat reporting 
  • Creating a first-touch response to cyberattacks
  • Developing timelines and execution methods
  • Allocating resources and mapping responsible government agencies 
  • Incentivizing cyber hygiene
  • Improving public-private partnership 

Recent years have shown an increase in state-sponsored cyberattacks—a 300 percent growth from 2000 to 2022, according to government data. For enterprises, the average financial cost of a ransomware attack is already over $4.5 million, and those attacks are only getting more sophisticated.

Learn more about top data security software and solutions.

What is the National Cybersecurity Strategy?

The NCS is a five-pillar action plan to ramp up cybersecurity efforts and bring all stakeholders together to ensure success. A solid national cybersecurity policy is essential to building on the promise of emerging technologies while minimizing the risks they pose.

Pillar One: Defend Critical Infrastructure

Defending critical infrastructure, including systems and assets, is vital for national security, public safety, and economic prosperity. The NCS will standardize cybersecurity standards for critical infrastructure—for example, mandatory penetration tests and formal vulnerability scans—and make it easier to report cybersecurity incidents and breaches. 

It seeks to label Infrastructure as a Service (IaaS) providers as critical infrastructure, putting more of the onus on them to ensure data security and protection and using legal accountability to eliminate insecure software products and unpatched vulnerabilities. It will also implement the zero trust cybersecurity model for federal networks.

Pillar Two: Disrupt and Dismantle Threat Actors

Once the national infrastructure is protected and secured, the NCS calls for aggressive efforts to neutralize threat actors that could compromise the cyber economy. This effort will rely upon global cooperation and intelligence-sharing to counter rampant cybersecurity campaigns and lend support to businesses by using national resources to tactically disrupt adversaries.

Components of this pillar include building awareness about threat actors, ransomware, IP theft, and other malicious attacks and creating a Cyber Safety Review Board (CSRB) to review catastrophic incidents and strategize based on repeated attack patterns. It will also implement new guidelines for already-impacted industries—manufacturing, energy, healthcare, and public sectors, for example—and new software bill-of-materials standards to lower supply chain risks.

Pillar Three: Shape Market Forces to Drive Security and Resilience

As the world’s largest economy, the U.S. has sufficient resources to lead the charge in future-proofing cybersecurity and driving confidence and resilience in the software sector. The goal is to make it possible for private firms to trust the ecosystem, build innovative systems, ensure minimal damage, and provide stability to the market during catastrophic events. 

The priority plan under this stage includes efforts to protect privacy and personal data security by creating federal grants to encourage investments in secure infrastructure and investing in cyber insurance initiatives to help private firms recover from high-scale attacks. It will also implement an Internet of Things (IoT) security labeling program to improve consumer awareness of IoT device risks.

Pillar Four: Invest in a Resilient Future

To aggressively combine innovation with security and forge an impregnable shield against the growing number of cybercrimes, the government has earmarked funds to secure next generation technology while ensuring necessary tech transfer and information dissemination between private and public sectors. The NCS will put a special impetus on data discovery, protection architecture, and encryption in all government to business communications.

This pillar also includes cybersecurity apprenticeships and a National Cyber Forensics and Training Alliance to train the workforce and improve cyber literacy, and the deployment of a unique digital identity authentication to thwart phishing attacks and create a trusted digital identity framework (TDIF).

Pillar Five: Forge International Partnerships to Pursue Shared Goals

Global leaders are learning that cyber diplomacy is a promising strategy for turning adversaries into allies. With pillar five, the government commits to continuing global initiatives against digital warfare and building a trust surplus among allies.

Among the ways it hopes to accomplish this is by creating a centralized tracker for coordinating cost-sharing initiatives, creating secure and dependable global supply chains, and strengthening partner nations’ capacities to shield themselves against cyberthreats. It will also establish a threat intelligence infrastructure to collaborate with allies and global agencies. 

Learn how to develop an IT security policy.

What Do Businesses Need to Know about the NCS?

Businesses will have to change some of their thinking around cybersecurity under the NCS. It makes the point that voluntary progress toward better cybersecurity and data privacy practices is no longer sufficient—and may never have been working at all. More than that, the government will implement new standards and regulatory frameworks and shift liability to hold enterprises accountable for not doing their part. It will also incentivize cybersecurity best practices.

Here are the three main actions businesses will be pushed to take by the NCS:

  1. Identify and minimize vulnerabilities by taking proactive measures to test and secure their threat landscape and holding partners and third-party vendors to similar cybersecurity standards.
  2. Address supply chain vulnerabilities by sharing information through new public/private partnerships, patching known vulnerabilities, providing employee cybersecurity training, and designing critical incident response plans.
  3. Put cybersecurity front-of-mind when developing software, processes, products, and networks to protect privacy and data—the NCS makes it clear that it expects businesses to take on more responsibility and will seek to enforce it. 

Bottom Line: Enterprise Changes in the NCS

The National Cybersecurity Strategy is the U.S. government’s most comprehensive cybersecurity initiative in years. As such, it’s a living document, an ever-evolving blueprint to build cyber-resilience and protect the U.S. and its allies from threats. More than just filling gaps, it ambitiously seeks to pave the way to a strong, equitable, and inclusive cyber future. Businesses of all sizes will have to play a role in its rollout and will be essential to its success, but it targets enterprises especially—the stakes are higher, the resources are more plentiful, and their responses have the potential to serve as frameworks and best practices for smaller businesses to follow.

Keeping data secure is just one component of an effective data management strategy. Learn the 10 best practices for data management to make sure your business has its own data efforts under control.


A Guide to the Most Common IoT Protocols and Standards 2023
https://www.datamation.com/edge-computing/iot-protocols-and-standards/ | Tue, 22 Aug 2023

Internet of Things (IoT) devices are seemingly everywhere, from the mobile phones in our pockets and the smart thermostats and doorbell cameras in our homes to the manufacturing facilities where they were made. Protocols and standards ensure that these devices can function correctly and communicate with one another, generating the data that makes them so useful. Here’s a look at the most common IoT protocols and standards.

What are IoT Protocols and Standards?

IoT protocols are established rules about how IoT devices should work and communicate. Standards are similar to protocols, but are used more widely—across an entire industry, for example. Together they ensure that all IoT devices have a minimum level of compatibility with one another and with other related devices and applications.

For instance, a manufacturer might use two different IoT sensors from different brands. As long as both companies follow the same guidelines, the sensors will work on the same network. IoT protocols and standards typically function in a single layer as a distinct part of a larger network—most commonly in the application and middleware layers of a standard five-layer network architecture, although not exclusively. For example, Bluetooth and Wi-Fi operate on the network layer.

Image: Diagram of standard five-layer network architecture, via Dr. João Pedro Reis.

Commercial IoT Standards and Protocols

Commercial IoT is a huge, still-growing industry. Interest in smart home tech is creating high demand for devices in the consumer electronics market. As a result, protocols and standards are emerging to ensure consumers get a streamlined, user-friendly experience. While some of these standards are also used in industrial applications, their biggest benefits stand out most in commercial settings. A few commercial IoT standards and protocols are so widely used they have become ubiquitous—like Bluetooth and Wi-Fi, for example.

Bluetooth

It’s hard to imagine consumer electronics today without the Bluetooth standard for wireless device-to-device communication. Every new smartphone, tablet, and laptop includes Bluetooth support as a standard feature.

Bluetooth was one of the first IoT communication protocols to open the door for a boom in consumer IoT devices, such as smartwatches and wireless headphones. It uses wireless personal area networks (WPANs), allowing for short-range data transmission using radio waves.

Bluetooth was originally standardized by the world’s largest technical professional organization, the IEEE, under standard IEEE 802.15.1, last revised in 2005. Though IEEE updates ceased in 2018—the Bluetooth Special Interest Group (SIG) now publishes the Bluetooth specifications—Bluetooth remains an extremely popular IoT protocol, particularly among consumer electronics.

Data Distribution Service (DDS)

The Data Distribution Service (DDS) protocol and standard is designed for communication across hardware and software platforms. Its main benefits include easy scalability, high reliability, and low-latency connectivity. DDS is great for ensuring all the IoT components in a system can maintain high-quality data transfers.

DDS is popular across commercial and industrial IoT applications. Originally published in 2004 by the Object Management Group, which maintains it today, it is a middleware protocol for standardizing machine-to-machine communication using the publisher-subscriber model.

Image: Diagram of DDS Scaling, via DDS Foundation/Object Management Group Inc.

Matter

Matter is a communication and interoperability standard designed to address the issue of smart home device communication between brands. Many commercial device manufacturers want consumers to buy all their smart home devices from one brand. This isn’t necessarily in the consumer’s best interest, but poor communication between products from different companies may force them to pick a single brand.

Matter ensures that smart home devices from participating manufacturers work together natively. It benefits both manufacturers and consumers. Since companies don’t have to be a one-stop shop, they can instead focus on making great smart thermostats, for example, without worrying about losing money to a competing brand that also makes other products.

Wi-Fi

Wi-Fi is among the oldest IoT standards and one of today’s most well-known and widely used. Its roots date back to 1942, when actress and inventor Hedy Lamarr patented a frequency-hopping technique. The technology evolved over the decades until the first Wi-Fi standard was created in 1997.

This first set of standards established the Wi-Fi we know today. The IEEE 802.11 family of standards outlines how communication over wireless local area networks (WLANs) should work. The original standard supported data transfer rates of up to 2 megabits per second. The IEEE continues to maintain the 802.11 standards, and Wi-Fi is still found in most consumer electronics and commercial IoT devices, such as smart home appliances and sensors.

XMPP

Extensible Messaging and Presence Protocol (XMPP) was originally developed for human-to-human communication in 2002. In the 20-plus years since, it has evolved into a machine-to-machine communication protocol popularly used by smart appliances.

Today, XMPP is an open-source protocol maintained by the XMPP Standards Foundation. It’s a lightweight middleware system that standardizes communication and XML data. XMPP runs in the application layer, where it can provide near-real-time data transfers. This responsiveness, combined with XMPP’s high accessibility, makes it ideal for communicating with smart home devices like appliances.

Industrial IoT Standards and Protocols

The industrial IoT market is among the strongest-performing in the world, which should come as no surprise given the countless applications of IoT in manufacturing, logistics, and construction. Industrial IoT (IIoT) is considered its own distinct niche.

IIoT standards and protocols are becoming increasingly important as businesses grow to rely on their IoT devices more. For instance, a manufacturer in a smart factory might use IIoT sensors to send maintenance alerts, which could affect employee safety. IoT communication standards ensure sensors send real-time alerts successfully, regardless of the brand or model.

Constrained Application Protocol (CoAP)

Constrained Application Protocol, or CoAP, is a protocol that gives IoT devices HTTP-style request/response interactions without excessive power consumption. Launched in 2013, it’s popular for machine-to-machine (M2M) communication—particularly in industrial applications like supply chain environments.

CoAP lets industrial users include a wider variety of IoT devices in their networks without being restricted by low power capabilities or bandwidth. Its main drawback is a lack of security features. CoAP is somewhat exposed on its own and needs the additional datagram transport layer security (DTLS) protocol to ensure secure data transmission.

Lightweight M2M (LWM2M)

Lightweight M2M, or LWM2M, is a protocol specifically for remote device management in IoT or machine-to-machine environments. It is purpose-built for IoT sensors, making it a highly useful protocol for industrial applications. Its light weight means it doesn’t require much power, storage, or computing resources to run.

LWM2M was originally published in 2017 and is still active and maintained by OMA SpecWorks. The 2020 update to the protocol added compatibility with edge networking and 5G, making LWM2M a cutting-edge standard for today’s industrial environment. LWM2M works over transports including TCP/TLS, MQTT, and HTTP.

MQTT

MQTT is an application-layer protocol for machine-to-machine communication using the publisher-subscriber model. It was developed in 1999 and is a popular open-source protocol for standardizing communication between industrial IoT devices.

MQTT is particularly well-suited for IIoT sensors due to its lightweight nature and tolerance for low bandwidth. Since it doesn’t require much memory space, MQTT is highly compatible with the full range of IIoT devices. It essentially acts as a bridge to applications.
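
As a sketch of MQTT’s publisher-subscriber model in practice, the example below assumes the widely used paho-mqtt Python client (v1.x callback API); the broker hostname and topic are purely illustrative.

```python
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"        # hypothetical broker address
TOPIC = "plant1/line3/temperature"

def on_connect(client, userdata, flags, rc):
    # Subscribe once the connection to the broker is established.
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    # Each sensor reading arrives as a message on the subscribed topic.
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client()               # paho-mqtt 1.x style constructor
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)

# A publisher elsewhere on the network would send readings like this:
client.publish(TOPIC, "72.4")

client.loop_forever()                # process network traffic and callbacks
```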

Zigbee

Zigbee is a highly popular network protocol specifically for mesh networks used in automation. Consumer and industrial devices use Zigbee, although its emphasis on automation and various applications makes it ideal for business. It was developed by the Connectivity Standards Alliance, which also created Matter.

Zigbee’s top benefits include low power consumption and a high degree of flexibility. It’s designed for short range, similar to Bluetooth. One feature that’s particularly beneficial in the industrial space is its high level of security. Zigbee includes encryption and authentication by default while staying lightweight. This means industrial users can build a mesh network of IoT devices with security features without using excessive power and computing resources.

Security IoT Standards and Protocols

Cybersecurity standards have always played an important role in the IoT’s development and growth. Some communication-related protocols include security features, but this isn’t always the case. A growing pool of IoT protocols and standards is designed to emphasize cybersecurity. Some of these are add-on rulesets for other offerings—for instance, Wi-Fi Protected Access 2 (WPA2) is one of today’s leading network security protocols to layer on top of Wi-Fi.

Ascon (NIST)

Ascon is the family of cryptographic algorithms the National Institute of Standards and Technology (NIST) selected in 2023 as the basis for its lightweight cryptography standard, intended to secure IoT devices and communications in the U.S.

Ascon is a collection of cryptographic algorithms that provide highly secure encryption without requiring high amounts of power and computing. Implementing Ascon can help IoT device manufacturers be more proactive about preventing cyberattacks and vulnerabilities rather than just responding to them.

DTLS

Datagram Transport Layer Security, or DTLS, is a security protocol for encrypted communications. A datagram is a standard data transfer unit, such as a single message—they are commonly used in gaming, streamed video, or videoconferencing applications.

Designed by the Internet Engineering Task Force, DTLS secures wireless communications so senders and receivers know their messages won’t be intercepted or spied on. It’s a commonly used protocol across commercial and industrial spaces.

Z-Wave

Z-Wave is a proprietary alternative to protocols like Bluetooth and Wi-Fi designed for encrypted mesh network communications, offering more security than its open-source counterparts. It operates on low-frequency, sub-GHz radio bands.

Z-Wave is popular among smart home automation systems, particularly those focusing on security. It is primarily used in consumer electronics and commercial applications but can also be used in industrial environments.

Bottom Line: Understanding IoT Protocols and Standards

IoT devices are a common part of people’s lives. They’re in our homes, our doctors’ offices, our oceans and skies, and businesses increasingly rely on them for a wide range of purposes. Day in and day out, these devices generate massive volumes of data used for business intelligence, competitive analysis, more efficient manufacturing, consumer feedback, and more. Dozens of protocols and standards run in the background to ensure that these devices and sensors work smoothly and securely and can communicate with each other effectively—understanding these protocols can help enterprises make better purchase decisions and build more secure, robust IoT networks.

Read next: Top 7 IoT Analytics Platforms

7 Data Management Trends: The Future of Data Management
https://www.datamation.com/big-data/data-management-trends/ | Wed, 02 Aug 2023

Data management trends are coalescing around the need to create a holistic framework of data that can be tapped into remotely or on-premises, in the cloud or in the data center. Whether structured or unstructured, this data must move easily and securely between cloud, on-premises, and remote platforms, and it must be readily available to everyone with a need to know and unavailable to anyone else.

Experts predict 175 zettabytes of data worldwide within two years, much of it coming from IoT (Internet of Things) devices. Companies of all sizes should expect significant troves of data, most of it unstructured and not necessarily compatible with system of record (SOR) databases that have long driven mission-critical enterprise systems like enterprise resource planning (ERP).

Even unstructured data should be subject to many of the same rules that govern structured SOR data. For example, unstructured data must be secured with the highest levels of data integrity and reliability if the business is to depend on it. It must also meet regulatory and internal governance standards, and it must be able to move freely among systems and applications on clouds, internal data repositories, and mobile storage.

To keep pace with the enormous demands of managing voluminous, high-velocity, and variegated data day in and day out, software-based tools and automation must be incorporated into data management practices. Newer automation technologies like data observability will only grow in importance, especially as citizen development and localized data use expand.

All of these forces require careful consideration as enterprise IT builds its data management roadmap. Accordingly, here are seven emergent data management trends in 2023.

Hybrid End-to-End Data Management Frameworks

Enterprises can expect huge amounts of structured and unstructured data coming in from a wide range of sources, including outside cloud providers; IoT devices, robots, drones, RF readers, and MRI or CNC machines; internal SOR systems; and remote users working on smartphones and tablets. All of this data might be committed to long- or short-term storage in the on-premises data center, in a cloud, or on a mobile or distributed server platform. In some cases, data may need to be monitored and/or accessed as it streams in real time.

In this hybrid environment, the data, its uses, and its users are diverse—data managers will need data management and security software that can span all of these hybrid activities and uses so data can be safely and securely transported and stored point to point.

IBM is a leader in the data management framework space, but SAP, Tibco, Talend, Oracle, and others also offer end-to-end data fabric management solutions. A second aspect of data management is being able to secure data, no matter where it is sent from or where it resides—end-to-end security mesh software from vendors such as Fortinet, Palo Alto Networks, and CrowdStrike can meet this need.

The Consolidation of Data Observability Tools

Because many applications now use multiple cloud and on-premises platforms to access and process data, observability—the ability to track data and events across platform and system boundaries with software—is a key focus for enterprises looking to monitor end-to-end movements of data and applications. The issue for most organizations using observability tools today is that they rely on too many different tools to achieve end-to-end data and application visibility across platforms.

Vendors like Middleware and Datadog recognize this and are focused on delivering integrated, “single pane of glass” observability tool sets. These consolidate multiple observability tools into a single toolset that can monitor data and event movements across cloud and on-premises systems and platforms.

Master Data Management for Legacy Systems

As businesses move forward with new technologies, they face the challenge of figuring out what to do with older ones. But some of those continue to provide value as legacy systems—systems that are outdated or that continue to run mission-critical functions vital to the enterprise.

Some of these legacy systems—for example, enterprise resource planning (ERP) systems like SAP or Oracle—offer comprehensive, integrated master data management (MDM) toolsets for managing data on their cloud or on-premises solutions. Increasingly, enterprises using these systems are adopting and deploying these MDM toolsets as part of their overall data governance strategies.

MDM tools offer user-friendly ways to manage system data and to import data from outside sources. MDM software provides a single view of the data, no matter where it resides, and IT sets the MDM business rules for data consistency, quality, security, and governance.

Data Management Using AI/ML

While the trend of using artificial intelligence and machine learning (AI/ML) for data management is not new, it continues to grow in popularity as the unprecedented volume of data enterprises must manage collides with an ongoing staffing shortage across the tech industry—especially in data-focused roles.

AI and ML introduce highly valuable automation to manual processes that have been prone to human error. Foundational data management tasks like data identification and classification can be handled more efficiently and accurately by advanced technologies in the AI/ML space, and enterprises are using it to support more advanced data management tasks such as:

  • Data cataloging
  • Metadata management
  • Data mapping
  • Anomaly detection
  • Metadata auto-discovery
  • Data governance control monitoring

As AI/ML continues to evolve, we can expect to see software solutions that offer intelligent, learning-based approaches including search, discovery, and capacity planning.
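As a concrete illustration of one task from the list above, the following sketch uses scikit-learn's IsolationForest to flag an anomalous data load based on simple data-quality metrics. The metrics, sample values, and contamination setting are assumptions chosen for the example, not a prescription for any specific platform.

```python
# Minimal sketch of ML-assisted anomaly detection on data-quality metrics
# (illustrative only; metrics and sample values are assumptions).
# Requires scikit-learn: pip install scikit-learn
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [row_count, null_ratio, duplicate_ratio] for one daily load of a table.
historical_loads = np.array([
    [10_000, 0.01, 0.02],
    [10_250, 0.02, 0.01],
    [ 9_900, 0.01, 0.02],
    [10_100, 0.02, 0.02],
    [10_050, 0.01, 0.01],
])

model = IsolationForest(contamination=0.1, random_state=0).fit(historical_loads)

todays_load = np.array([[4_200, 0.35, 0.20]])   # far fewer rows, far more nulls than usual
flag = model.predict(todays_load)                # -1 = anomaly, 1 = normal
print("anomalous load" if flag[0] == -1 else "load looks normal")
```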

Prioritizing Data Security

In the first quarter of 2023, over six million data records were breached worldwide. A data breach can destroy a company’s reputation, impact revenue, endanger customer loyalty, and get people fired. This is why security of all IT—especially as more IT moves to the edge and the IoT—is an important priority for CIOs and a major IT investment area.

To meet data security challenges, security solution providers are moving toward more end-to-end security fabric solutions. They are also offering training for employees and IT, since the growth of citizen development and poor user security habits can be major causes of breaches.

Although many of these security functions will be performed by the IT and network groups, clean, secure, and reliable data is also a concern for database administrators, data analysts, and data storage teams.

Automating Data Preparation

The exponential growth of big data volumes and a shrinking pool of data science talent are stressing organizations. In some cases, more than 60 percent of expensive data science time is spent cleaning and preparing data.

Software vendors want to address this corporate pain point with data preparation and cleaning automation software that can perform these tedious, manual operations. Automated data preparation solutions ingest, store, organize, and maintain data, often using AI and ML to handle these manually intensive tasks.
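The following is a minimal sketch of what automated data preparation can look like in practice, using pandas to normalize headers, deduplicate rows, and standardize values. The column names and cleaning rules are hypothetical and would differ in a real pipeline.

```python
# Minimal sketch of automated data preparation with pandas
# (illustrative only; column names and cleaning rules are assumptions).
import pandas as pd

def prepare(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]  # normalize headers
    df = df.drop_duplicates()                                               # remove exact duplicates
    if "email" in df.columns:
        df["email"] = df["email"].str.strip().str.lower()                   # standardize values
    df = df.dropna(how="all")                                               # drop empty rows
    return df

raw = pd.DataFrame({
    "Email ": ["A@Example.com", "A@Example.com", None],
    "Signup Date": ["2023-01-05", "2023-01-05", None],
})
print(prepare(raw))
```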

Using Blockchain and Distributed Ledger Technology

Distributed ledger systems enable enterprises to maintain more secure transaction records, track assets, and keep audit trails. This technology, along with blockchain technology, stores data in a decentralized form that cannot be altered, improving the authenticity and accuracy of records related to data handling. This includes financial transaction data, sensitive data retrieval activity, and more.

Blockchain technology can be used in data management to improve the security, shareability, and consistency of data. It can also be used to provide automatic verification, offering avenues to improve data governance and security.
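A simple way to see the tamper-evidence idea behind ledger-style record keeping is a hash chain, where each entry includes the hash of the previous one. The sketch below is illustrative only; it is not a distributed ledger and omits consensus, replication, and key management.

```python
# Minimal sketch of a hash-chained audit trail, the basic idea behind
# ledger-style tamper evidence (illustrative only; not a production ledger).
import hashlib, json, time

def add_entry(chain: list[dict], event: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "timestamp": time.time(), "prev_hash": prev_hash}
    body_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": body_hash})

def verify(chain: list[dict]) -> bool:
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: block[k] for k in ("event", "timestamp", "prev_hash")}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != expected_prev or block["hash"] != recomputed:
            return False
    return True

log: list[dict] = []
add_entry(log, {"actor": "svc-etl", "action": "read", "object": "customer_pii"})
add_entry(log, {"actor": "analyst-42", "action": "export", "object": "q3_transactions"})
print(verify(log))                      # True
log[0]["event"]["action"] = "delete"    # tamper with history
print(verify(log))                      # False: the chain no longer verifies
```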

Bottom Line: The Future of Data Management

As businesses confront the need to collect and analyze massive volumes of data from a variety of sources, they seek new means of data management that can keep pace with the expanding need. Cutting-edge technologies like AI/ML and blockchain can be used to automate and enhance some aspects of data management, and software vendors are incorporating them into their platforms to make them an integral part of the work. As new technologies continue to evolve, data management methods will evolve with them, driven by increasing demand.

Read next: Structured Data: Examples, Sources, and How it Works

]]>
Companies Hiring for Cybersecurity Jobs https://www.datamation.com/careers/cybersecurity-jobs/ Fri, 21 Jul 2023 17:30:16 +0000 https://www.datamation.com/?p=22324 Cybersecurity has become an increasingly critical component of doing business in the digital age, and those with the expertise to help companies guard against relentless threats and mitigate damages from attacks have no shortage of opportunities. As of July 2023, LinkedIn features more than 205,600 cybersecurity-related job postings, paying a median salary of $147,895 for cybersecurity engineers and $90,493 for systems engineers.

While postings change frequently, this article highlights six companies currently recruiting for cybersecurity roles to provide some insight into the types of jobs that are available and the benefits they offer.

Microsoft icon

Microsoft

Microsoft Corporation is a widely known provider of software, hardware, and cloud solutions, and one of the most valuable publicly traded companies in the world. Its products include the Windows operating system, the Microsoft Office suite, the Azure cloud platform, and gaming consoles.

Employees on Glassdoor rated Microsoft four out of five stars, and 87 percent said they would recommend working there.

Best Perks

Employees at Microsoft have many benefits including a health savings account (HSA), disability, health, vision, dental, life, and accidental death and dismemberment (AD&D) insurance; childcare/babysitting reimbursement, maternity and paternity leave, and adoption assistance; sick time, gym/wellness reimbursement, free lunches, and paid time off (PTO); relocation bonus, remote work, and immigration assistance; Employee Stock Purchase Program (ESPP), student loan repayment plan, 401k, and flexible spending account (FSA); tuition reimbursement, donation match, volunteer time off, and family sickness leave.

Headquarters

Microsoft is headquartered in Redmond, Washington.

Salary and Role Examples

  • Senior Offensive Security Engineer, Penetration Testing: $112,000 to $238,600
  • Security Analyst II, Data Science: $94,300 to $198,600
  • Senior Platform Engineer, Customer Security and Trust: $112,000 to $238,600
  • Senior Director, End-to-End Security Product Marketing Manager: $152,300 to $292,200
  • Principal Firmware Security Engineer, Software Engineering: $133,600 to $282,200

Learn more: Top Trends For Cybersecurity Jobs

Palo Alto Networks icon

Palo Alto Networks

Palo Alto Networks is a cybersecurity company that provides firewall and cloud security solutions for businesses. The company offers next-generation security technologies and threat intelligence to protect networks from cyber threats and ensure the safety of data and infrastructure.

Employees on Glassdoor rated Palo Alto Networks four out of five stars, and 82 percent said they would recommend working there.

Best Perks

Palo Alto provides benefits such as student loan resources, HSA, gym/wellness reimbursement, and unlimited PTO; health, life, disability, vision, dental, pet, and AD&D insurance; phone bill reimbursement, adoption assistance, fertility assistance, and business travel insurance; immigration assistance, 401k, FSA, and ESPP; Roth IRA, child and elder care, legal protection, and auto and home insurance.

Headquarters

Palo Alto Networks is headquartered in Santa Clara, California.

Salary and Role Examples

  • Attack Surface Data Analyst, Xpanse: $74,400 to $120,350
  • Security Researcher, DNS Security: $139,400 to $225,500
  • Systems Engineer, Strategic Accounts: $187,500 to $257,850
  • Sr. Professional Services Consultant, NGFW: $131,300 to $180,500
  • System Engineer, Major Accounts: $187,500 to $257,850

CrowdStrike icon

CrowdStrike

CrowdStrike is a cybersecurity company that specializes in endpoint security. Its cloud-based platform combines artificial intelligence (AI) and machine learning (ML) to identify and stop cyber attacks as they happen, helping businesses secure their data and infrastructures.

Employees on Glassdoor rated CrowdStrike over four out of five stars, and 81 percent said they would recommend working there.

Best Perks

CrowdStrike offers many benefits to employees including health, life, disability, vision, dental, and AD&D Insurance; ESPP, 401k, and parental and fertility assistance; remote-friendly work, health and wellness programs, professional development, and executive coaching/mentorship.

Headquarters

CrowdStrike is headquartered in Austin, Texas.

Salary and Role Examples

  • Technical Support Engineer, Operations: $65,000 to $110,000
  • Analyst I, Falcon Complete: $80,000 to $115,000
  • Sr. Software Engineer, Data Infrastructure Engineering: $105,000 to $195,000
  • Systems Administrator, Executive Support: $80,000 to $115,000
  • Principal Consultant, Red Team: $125,000 to $185,000

Fortinet icon

Fortinet

Fortinet is a cybersecurity company that offers many network security products and services, including firewall, VPN, intrusion detection and prevention, and threat protection for businesses of all sizes.

Employees on Glassdoor rated Fortinet four out of five stars, and 86 percent said they would recommend working there.

Best Perks

Fortinet offers many benefits including health, dental, life, vision, disability, and accident insurance; HSA, FSA, and mental health care; 401k, stock options, ESPP, and retirement plans; work from home, maternity/paternity leave, and family medical leave; PTO, volunteer time off, and paid holidays.

Headquarters

Fortinet is headquartered in Sunnyvale, California.

Salary and Role Examples

  • Sr. Cybersecurity Technology Communications Manager, PR: $100,000 to $160,000
  • Senior Software DevOps Engineer, DevOps: $110,000 to $150,000
  • Senior Security Researcher, IPS Analysis: $120,000 to $160,000
  • Cloud Solutions Architect, Solutions Architecture: $200,000 to $240,000
  • Senior Product Security Engineer, Product Security Engineering: $150,000 to $200,000

Learn more: Today’s Cybersecurity Job Market

Tenable icon.

Tenable

Tenable is a cybersecurity company that focuses on vulnerability management and cyber exposure solutions. Its products help companies identify, assess, and prioritize security to provide effective risk management and proactive security measures.

Employees on Glassdoor rated Tenable four out of five stars, and 61 percent said they would recommend working there.

Best Perks

Tenable provides benefits including health, vision, dental, life, disability, and AD&D insurance; HSA, FSA, 401k, and retirement plans; work from home, maternity/paternity leave, adoption assistance, and family medical leave; PTO, company events, and paid holidays.

Headquarters

Tenable is headquartered in Columbia, Maryland.

Salary and Role Examples

  • Manager Endpoint Vulnerability Management, Digital Solutions: $112,000 to $149,333
  • Staff Software Engineer, Engineering: $142,000 to $189,333
  • Staff Site Reliability Engineer, Engineering: $142,000 to $189,333
  • IT Asset Manager, Digital Solutions: $34.13 to $45.67 per hour
  • Territory Account Manager, Sales: $112,000 to $149,333

Proofpoint icon

Proofpoint

Proofpoint is a cybersecurity company that specializes in email security, data loss prevention, and threat protection. It provides companies with protection against phishing attacks, malware, and other data threats, and offers data encryption and other services to ensure data security.

Employees on Glassdoor rated Proofpoint four out of five stars, and 76 percent said they would recommend working there.

Best Perks

Proofpoint provides benefits including health, vision, dental, life, disability, and AD&D insurance; HSA, FSA, 401k, and stock purchase plans; work from home, maternity/paternity leave, adoption assistance, and family medical leave; PTO, company events, and paid holidays.

Headquarters

Proofpoint is headquartered in Sunnyvale, California.

Salary and Role Examples

  • Senior Data Scientist, ML Lab: $157,650 to $231,220
  • Director of Engineering, Shared Services: $168,080 to $319,550
  • Senior Software Engineer, SaaS Product: $117,600 to $231,220

Read next: 10 Top Cybersecurity Predictions for 2023

]]>
Report: Most U.S. Companies Say Data Infrastructure Falling Short https://www.datamation.com/big-data/report-most-u-s-companies-say-data-infrastructure-falling-short/ Tue, 18 Jul 2023 19:33:20 +0000 https://www.datamation.com/?p=24395 Nearly 60 percent of U.S. companies are overwhelmed by the amount of data they manage and 76 percent are concerned their current infrastructure will be unable to scale to meet upcoming demands. That’s according to a new survey from Hitachi Vantara, the modern infrastructure, data management, and digital solutions subsidiary of Hitachi, Ltd. 

New and exciting data-intensive technologies and applications like generative AI are spurring a gold rush toward greater insights, automation, and predictability, but these technologies are simultaneously adding strain to the already-stretched infrastructure and hybrid cloud environments on which they run.

Key Findings

Hitachi Vantara conducted a global survey of 1,288 C-level executives and IT decision makers, including 308 in the United States, to quantify the extent to which organizations are struggling to manage their data infrastructure in a secure and sustainable way. Its newly published Modern Data Infrastructure Dynamics Report reveals that most companies expect their data needs to nearly double in the next two years, and that both protecting and managing that rapid growth of data in an actionable and sustainable way are further complicating efforts.

Key U.S. findings include:

  • Respondents say that data is their most valuable asset but are concerned about the security and resilience of their data infrastructure; 71 percent are concerned they cannot detect a data breach in time to protect data.
  • 65 percent of respondents are concerned over whether their organization’s data infrastructure is resilient enough to recover data from ransomware attacks.
  • 27 percent of respondents admitted that important data was not backed up and 35 percent had experienced data inaccessibility due to storage outages.
  • 66 percent of IT leaders currently measure their data center’s energy consumption; however, 32 percent acknowledged that their data infrastructure uses too much energy and nearly half admitted their sustainability policies don’t address the impact of storing unused data.

The study also shed light on the future of data storage, with the hybrid cloud model (a mix of public and private cloud, co-location, and on-premises infrastructure) expected to persist. Among U.S. business leaders, the study found data already spread across an established hybrid cloud, with data center workloads located almost evenly between the public cloud (24 percent), private cloud (25 percent), on-premises infrastructure (25 percent), and co-located/managed services (22 percent). Notably, those percentages were expected to stay largely the same over the coming two years. Download the full report here.

Read next: Big Data Use Cases

]]>
The Top Intrusion Prevention Systems https://www.datamation.com/trends/top-intrusion-prevention-systems Wed, 14 Jun 2023 16:37:52 +0000 https://www.datamation.com/?p=24273 Cyber threats pose significant risks to organizations of all sizes, making robust security measures imperative. An intrusion prevention system (IPS) is one critical component in an organization’s cybersecurity arsenal, acting as a vigilant gatekeeper to actively monitor network traffic and prevent unauthorized access and malicious attacks. Choosing the right IPS can depend on everything from whether it is network-based or host-based to how well it integrates with existing systems and how much it costs.

We’ve rounded up the best intrusion prevention systems to help make the selection process less daunting. Here are our top picks:

Top Intrusion Prevention System Comparison At-a-Glance

Here’s a look at how the top IPSs compared based on key features.

Product | Real-Time Alerts | Integration with Other Security Systems | Type of Intrusion Detection | Automatic Updates | Pricing
Cisco Secure Next-Generation Intrusion Prevention System | Yes | Yes | Network-based | Yes | On-contact
Fidelis Network | Yes | Yes | Network-based | Yes | 15-day free trial
Palo Alto Networks Threat Prevention | Yes | Yes | Network-based and host-based | Yes | Free trial
Trellix Intrusion Prevention System | Yes | Yes | Network-based and host-based | Yes | On-contact

Jump to:

  1. Key Intrusion Prevention System Features
  2. How to Choose an IPS
  3. Frequently Asked Questions (FAQs)

Cisco icon

Cisco Secure Next-Generation Intrusion Prevention System

Best for comprehensive network security

Cisco offers advanced threat protection solutions with Cisco Secure IPS. This cloud-native platform offers robust security with unified visibility and intuitive automation. It gathers and correlates global intelligence in a single view and can handle large traffic volumes without impacting network performance.

This highly flexible solution can be easily deployed across different network environments, as its open architecture supports Amazon Web Services (AWS), VMware, Azure, and other platforms.

Features

  • Enhanced visibility with Firepower Management Center
  • Constantly updated early-warning system
  • Flexible deployment options for inline inspection or passive detection
  • Cisco Threat Intelligence Director for third-party data ingestion

Pros

  • Real-time data inputs optimize data security
  • Easy integration without major hardware changes
  • High scalability with purpose-built solutions

Cons

  • Expensive for small-scale organizations
  • Initial integration challenges

Pricing

Cisco offers free trials for most products, including its IPS, but does not make its pricing readily available. For details, contact Sales Support.

Fidelis Cybersecurity icon

Fidelis Network

Best for Advanced Threat Detection Response

Fidelis Network improves security efficiency by detecting advanced threats and behavioral anomalies, employing a proactive cyber-defense strategy to respond to threats before they can affect a business. Fidelis Network can bolster data security with rich insights into bi-directional encrypted traffic.

This specific network defense solution helps prevent future breaches with both real-time and retrospective analysis.

Features

  • Patented Deep Session Inspection to detect data exfiltration
  • Improved response with the MITRE ATT&CK framework and intelligence feed from Fidelis Cybersecurity
  • Unified network detection and response (NDR) solution for simplified network security
  • Customizable real-time content analysis rules for proactive network security

Pros

  • Faster threat analysis and improved security efficiency
  • Deeper visibility and threat detection with more than 300 metadata attributes
  • Single-view and consolidated network alerts with rich cyber terrain mapping

Cons

  • Complex configuration and setup
  • High-traffic environments cause network latency
  • Tighter integration with other tools is required

Pricing

Fidelis Network offers a 15-day free trial, and the vendor will schedule a demo beforehand to show off the system’s capabilities and features.

Palo Alto Networks icon

Palo Alto Networks Advanced Threat Prevention 

Best for Zero-Day Exploits

Palo Alto Networks’ Advanced Threat Prevention is based on purpose-built, inline deep learning models that secure businesses from the most advanced and evasive threats. Powered by multi-pronged detection mechanisms that efficiently handle unknown injection attacks and zero-day exploits, this highly scalable solution blocks command and control (C2) attacks in real time without compromising performance.

Features

  • ML-Powered NGFWs for complete visibility
  • Customized protection with Snort and Suricata signature support
  • Real-time analysis with enhanced DNS Security Cloud Service
  • Latest security updates from Advanced WildFire

Pros

  • Ultra low-latency native cloud service
  • Combined App-ID and User-ID identification technologies
  • Customized vulnerability signatures
  • Complete DNS threat coverage

Cons

  • Overly complex implementation for simple configurations
  • High upfront costs

Pricing 

Palo Alto Networks offers free trials, hands-on demos, and personalized tours for its products and solutions, but does not make its pricing models publicly available. Contact sales for details.

Trellix icon

Trellix Intrusion Prevention System

Best for On-Prem and Virtual Networks

Trellix Intrusion Prevention System provides comprehensive and effective security for business networks in two variants: Trellix Intrusion Prevention System and Trellix Virtual Intrusion Prevention System. The virtual variant addresses private and public cloud requirements, securing virtualized environments using advanced inspection technologies.

Features

  • Botnet intrusion detection across the network
  • Enhanced threat correlation with network threat behavior analysis
  • Inbound and outbound SSL decryption
  • East-west network visibility

Pros

  • Both signature-based and signature-less intrusion detection
  • Unified physical and virtual security
  • Maximum security and performance (scalability up to 100 Gbps)
  • Shared licensing and throughput model

Cons

  • Older variants and models still exist
  • Confusing pricing options
  • High rates of false positives

Pricing

Schedule a demo to learn whether Trellix meets specific requirements. The vendor does not make pricing models publicly available; contact sales.

Key IPS Features

When deciding on an intrusion prevention system, make sure the features and capabilities match specific needs. Key features include the following:

Real-time alerts

Proactive threat detection and prompt incident response require real-time visibility. Timely alerts help organizations implement preventive measures before any significant damage is done to their security posture. Advanced IPSs have real-time monitoring capabilities to identify potential vulnerabilities and minimize the impact of security incidents.

Integration with other security systems

Intrusion prevention systems cannot operate in isolation. For the efficient protection of the entire business security infrastructure, they must integrate with other security solutions and platforms for a coordinated response. This also helps with the centralized management of security incidents.

Type of intrusion detection

There are two main types of intrusion detection: network-based and host-based. While network-based intrusion detection examines and analyzes network traffic for vulnerabilities, host-based intrusion detection checks individual systems like servers, endpoints, or particular assets.

Automatic updates

Automatic updates can help ensure an IPS adapts to a continuously evolving landscape of new threats and newly discovered vulnerabilities. They can also help keep pace with changing compliance and regulatory requirements and implement the latest security patches.

Threat intelligence

Threat intelligence helps an IPS enhance detection capabilities and minimize vulnerabilities with efficient mitigation strategies. With threat intelligence capabilities, IPS solutions access timely and actionable information to develop effective response strategies.

How to Choose an IPS

Here are some factors to consider when choosing an IPS:

Configuration type

There are broadly four types of IPS configurations, depending on the network environment, security policies, and requirements where they will be implemented: network-based, host-based, wireless, and network behavior analysis. Multiple configurations can also be combined to cover complex environments.

Detection capabilities

Intrusion prevention systems use different detection techniques to identify malicious activities—primarily signature-based, anomaly-based, and protocol-based. Signature-based detection spots consistent cyber threat patterns using a static list of known signatures, while anomaly-based detection flags deviations from normal activity patterns. Protocol-based systems offer the flexibility to set baselines for benign protocol activity.
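The sketch below contrasts the two most common techniques in simplified form: signature matching against known-bad patterns and a basic anomaly check against a learned traffic baseline. The signatures, baseline numbers, and threshold are assumptions for illustration, not rules from any vendor's IPS.

```python
# Minimal sketch contrasting signature-based and anomaly-based detection
# (illustrative only; signatures, baseline, and threshold are assumptions).
SIGNATURES = [b"' OR 1=1 --", b"/etc/passwd", b"<script>"]   # known-bad byte patterns

def signature_match(payload: bytes) -> bool:
    return any(sig in payload for sig in SIGNATURES)

def anomaly_score(requests_per_minute: int, baseline_mean: float, baseline_std: float) -> float:
    # Simple z-score against the learned baseline of normal traffic.
    return abs(requests_per_minute - baseline_mean) / baseline_std

payload = b"GET /search?q=' OR 1=1 -- HTTP/1.1"
print("signature hit" if signature_match(payload) else "no signature hit")

score = anomaly_score(requests_per_minute=950, baseline_mean=120, baseline_std=40)
print("anomalous rate" if score > 3 else "within normal range")   # flag > 3 standard deviations
```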

Integration options

Intrusion prevention systems can be integrated using dedicated hardware and software, or incorporated within existing enterprise security controls. Businesses that don’t want to upgrade system architecture or invest in products or resources can rely on managed service providers for security, but an IPS purchased and installed on the network offers more control and authority.

Frequently Asked Questions (FAQs)

What is the difference between intrusion detection systems and intrusion prevention systems?

Intrusion detection systems help detect security incidents and threats and send alerts to the Security Operations Center (SOC). Issues are investigated by security personnel and countermeasures executed accordingly. Essentially, they’re monitoring tools. While intrusion prevention systems also detect potential threats and malicious incidents, they automatically take appropriate actions, making them highly proactive, control-based cybersecurity solutions.

How do intrusion prevention systems help businesses?

Intrusion prevention systems are key to enterprise security as they help prevent serious and sophisticated attacks. Some of the key benefits of IPS for businesses are:

  • Reduced strain on IT teams through automated response
  • Customized security controls as per requirements
  • Improved performance by filtering out malicious traffic

Do intrusion prevention systems affect network performance?

Intrusion prevention systems may slow down the network in the case of inadequate bandwidth and capacity, heavy traffic loads, or computational burdens.

Methodology

In order to provide an objective and comprehensive comparison of the various IPSs available in the market, we followed a structured research methodology. We defined evaluation criteria, conducted market research, collected data on each solution, evaluated and scored them, cross-verified our findings, and documented the results. Additionally, we considered user reviews and feedback to gain valuable insights into the real-world performance and customer satisfaction of each intrusion prevention solution.

Bottom Line: Top Intrusion Prevention Systems

The top intrusion prevention systems all work to protect enterprise networks from the ever-present, always evolving threat of cyberattack, but some stand out for different use cases. Selecting the right one will depend on the organization’s security needs, goals, and budget. Regular evaluation and updates are crucial to staying ahead of evolving threats and ensuring a robust security posture—the right IPS can enhance network security, protect sensitive data, and safeguard a business against potential cyber threats.

]]>
EDR vs. NDR vs. XDR: Which Should You Use? https://www.datamation.com/security/edr-vs-ndr-vs-xdr Tue, 25 Apr 2023 22:25:52 +0000 https://www.datamation.com/?p=24059 Endpoint detection and response (EDR), network detection and response (NDR), and extended detection and response (XDR) are closely related categories of threat detection technology. Each of these tools can detect and respond to cyberattacks originating from a variety of sources, but they vary in their sophistication. 

This guide will help you understand how these tools often complement one another within an overarching network security approach. 

  • EDR is best suited for organizations that need to oversee many endpoints, though it is rarely used as a standalone network security solution.
  • NDR is best used when packet inspection is important to an organization, as this tool provides more context versus EDR and XDR.
  • XDR is best used in larger network architectures that could benefit from a centralized, unified approach to threat detection.

For more information, also see: Artificial Intelligence in Cybersecurity

Endpoint Detection & Response (EDR)

EDR, as the name implies, protects networks at each connected endpoint, reducing the risk of network breaches and attacks that occur at these oft-targeted locations. These systems identify tangible changes at the endpoint level. In modern enterprise networks, there can be hundreds or even thousands of endpoints connected to networked devices, including IoT devices like sensors and communication devices deployed in the field.

Advanced EDR systems utilize tools like machine learning and AI to uncover new threats and suspicious behavior and activity. 
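As a simplified illustration of the kind of endpoint telemetry an EDR agent evaluates, the sketch below applies a few rule-of-thumb checks to process events. The event fields and rules are hypothetical; commercial EDR products use far richer telemetry along with the ML models described above.

```python
# Minimal sketch of rule-based flagging of endpoint events, the kind of
# telemetry an EDR agent evaluates (illustrative fields and rules only).
from dataclasses import dataclass

@dataclass
class EndpointEvent:
    process: str
    parent: str
    path: str
    action: str          # e.g. "exec", "write", "registry_set"

def is_suspicious(e: EndpointEvent) -> bool:
    if e.action == "exec" and e.path.lower().startswith(("c:\\users\\", "/tmp/")):
        return True                                   # execution from a user/temp directory
    if e.action == "write" and "\\system32\\" in e.path.lower():
        return e.process.lower() not in {"trustedinstaller.exe", "msiexec.exe"}
    if e.parent.lower() in {"winword.exe", "excel.exe"} and e.process.lower() == "powershell.exe":
        return True                                   # Office application spawning a shell
    return False

events = [
    EndpointEvent("powershell.exe", "winword.exe", "C:\\Windows\\System32\\powershell.exe", "exec"),
    EndpointEvent("backup.exe", "services.exe", "C:\\Program Files\\Backup\\backup.exe", "exec"),
]
for e in events:
    print(e.process, "->", "FLAG" if is_suspicious(e) else "ok")
```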

EDR Pros

Better protection of endpoints improves organizations’ overall security postures. Bad actors frequently target endpoints, so more protection at these vulnerable network connections is an overall positive. As valuable as EDR tools are, however, most organizations will require additional network security tools as well. This is especially true with more employees working remotely and in hybrid setups.

XDR, outlined below, may provide the best solution for these situations. 

EDR Cons

One significant limitation of EDR is that the detection logs generated by these tools do not always trigger alerts. Organizations will need to perform periodic, manual reviews of endpoint data to prevent cyber attacks. Also, EDR on its own often cannot be deployed on all devices, including many BYOD and IoT devices or in environments like the public cloud. Threat actors seek out these gaps in visibility, looking for opportunities to exploit these vulnerabilities. 

EDR Deployment Methods

EDR is typically deployed in one of two environments: on-premises or in the cloud. 

On-premises deployment works best for relatively small organizations whose assets are all located in the same geography, especially those that want to keep their data within reach. However, this approach is limited in that EDR deployed on-premises can’t support real-time behavioral analysis. Also, the updating process can become laborious and time-consuming. This is also the more expensive option. 

EDR deployed in the cloud offers several advantages over on-premises deployment, including more scalability, integrity, and flexibility, and better overall manageability. However, cloud-based EDR may not offer the same level of security, especially related to industry regulations around data privacy. 

For more information, also see: What is Big Data Security?

EDR integrations

Top rated EDR vendors that provide EDR integration include:

EDR average price

EDR is usually priced per endpoint, per month, with fees starting around $10. 

For more information, also see: Why Firewalls are Important for Network Security

Network Detection & Response (NDR)

NDR differs from EDR and XDR in that it centers on the analysis of packet data in network traffic, rather than endpoints or other data streams, to uncover potential cyber threats. Packets contain a wealth of valuable information. 

NDR works by continuously monitoring and recording network traffic in search of reliable patterns of expected network behavior. It uses those patterns to analyze packet data for anomalies that could indicate threats, and then either alerts the security team or mitigates threats automatically.
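The following sketch shows the baseline-and-deviation idea in miniature: it tracks per-host traffic volume over a rolling window and flags a sudden spike. The host names, window size, and threshold are illustrative assumptions, and real NDR products analyze far more than byte counts.

```python
# Minimal sketch of baselining per-host traffic volume and flagging deviations,
# a simplified version of the behavioral analysis NDR tools perform
# (host names, window size, and threshold are illustrative assumptions).
from collections import defaultdict, deque
from statistics import mean, stdev

WINDOW = 30          # minutes of history retained per host
THRESHOLD = 4.0      # flag if the current minute deviates by > 4 standard deviations

history: dict[str, deque] = defaultdict(lambda: deque(maxlen=WINDOW))

def observe(host: str, bytes_this_minute: int) -> bool:
    """Record one minute of traffic for a host; return True if it looks anomalous."""
    samples = history[host]
    anomalous = False
    if len(samples) >= 10 and stdev(samples) > 0:
        z = abs(bytes_this_minute - mean(samples)) / stdev(samples)
        anomalous = z > THRESHOLD
    samples.append(bytes_this_minute)
    return anomalous

# Simulate a quiet host that suddenly sends a very large volume of data.
for minute in range(20):
    observe("db-server-01", 50_000 + (minute % 3) * 1_000)
print(observe("db-server-01", 9_000_000))   # True: the sudden spike stands out
```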

Often, NDR solutions are packaged alongside other tools like security information and event management (SIEM) products and EDR, elevating the effectiveness of those cyber security tools by helping to reduce blind spots across a given network. 

NDR Pros

NDR increases security capabilities by equipping security teams with more network context and automated threat response. This contributes to better collaboration between network and security teams and, most importantly, quicker mitigation of threats and attacks. 

A key benefit of using NDR is the forensic information these systems can provide. Reports generated by NDR can help security teams determine how malware initially breached a network, information that can then be applied to mitigation solutions. 

NDR can uncover newer and more evolved malware, including polymorphic malware. It can also help detect attacks that use so-called weaponized AI. 

NDR Cons

NDR does come with some limitations. First, these solutions can only analyze network logs — NDR cannot monitor or track endpoint events like process details, registry changes, or system commands. NDR is also unable to examine certain cloud or identity data and other sources of security information. 

These limitations underscore why NDR, like EDR, is not generally utilized as a standalone security solution. It is a tool that can enhance an overarching security approach. 

NDR Deployment Methods

Like EDR, NDR can be deployed on-premises or via cloud-based solutions, depending on organizational needs. 

On-premises NDR deployment is better suited for organizations whose assets are all located in the same geography, especially those that want to keep their data within reach. Like EDR, updating NDR can become laborious and time-consuming, and it is the more expensive option versus cloud-based deployment. 

NDR can also be deployed in the cloud, which offers several advantages — more scalability, integrity, flexibility, and better overall manageability. However, cloud-based NDR is, again, not as secure as on-premises deployment and may not be well suited for organizations that need to adhere to various data privacy regulations. 

NDR integrations

Top rated NDR vendors that provide NDR integration include:

NDR Average Price

NDR is typically priced per user, per month, starting around $20 for medium-sized organizations. 

For more information, also see: What is Firewall as a Service? 

Extended Detection & Response (XDR)

Of the three threat detection approaches compared here, XDR is the most advanced and, unsurprisingly, provides the most holistic protection against cyber attacks.

One way to think of XDR is as an evolution of EDR and NDR that integrates network, application, and cloud data sources to respond quickly and effectively to threats as they emerge. There are three main XDR platform categories:

  • Native XDR, which works exclusively with products from a single vendor.
  • Open XDR, which works with all vendors.
  • Hybrid XDR, which is capable of integrating data from some outside vendors, with limitations.

XDR Pros

XDR solutions are more proactive when it comes to threat detection and response. These platforms centralize visibility across multiple data streams, including endpoint data, network data, and cloud data. Used alongside tools like SIEM and security orchestration, automation, and response (SOAR), XDR is capable of addressing very complex threats. 

XDR Cons

While XDR is attractive to organizations seeking to centralize cyber security oversight across multiple data types, most will still want to tap into the context provided by tools like NDR. 

XDR solutions can be expensive, even beyond the actual platform and vendor agreement. Organizations may need to retrain employees or hire expert staff to run these tools because they are more complex to deploy and maintain. As the cyber threatscape evolves, XDR will need to be enhanced periodically as well, which will incur additional costs. 

XDR Deployment Methods

Like NDR and EDR, XDR can be deployed on-premises, in the cloud, or via a hybrid arrangement. Most organizations investing in a solution like XDR will deploy into a hybrid environment. 

Top rated XDR vendors that provide XDR integration include:

XDR Average Price

Similar to NDR, XDR is usually priced per user (or license), per month, starting at around $60 per user, per month. 

For more information, also see: How to Secure a Network: 9 Steps

Bottom line: EDR vs. NDR vs. XDR

While all three threat detection solutions do, in fact, work to detect threats, EDR, NDR, and XDR vary in their capabilities.

EDR can monitor and mitigate endpoint attacks, but is limited in scope. At the other end of the threat detection spectrum, XDR offers benefits like a more unified platform approach — however, XDR reporting often lacks the network context available through an NDR solution that offers real-time packet monitoring. 

Many large organizations need solutions that incorporate both network and endpoint data monitoring with other, overarching security tools in order to gain a true, real-time viewpoint of network behavior. A comprehensive enterprise security solution often includes NDR, EDR, XDR, SIEM, and SOAR. 

On a related topic, also see: Top Cybersecurity Software

]]>