
WhiteSource Rolls Out New Open Source Security Detector


By Jack M. Germain

May 24, 2018 10:24 AM PT

WhiteSource on Tuesday launched its next-generation software composition analysis (SCA) technology, dubbed “Effective Usage Analysis,” with the promise that it can reduce open source vulnerability alerts by 70 percent.

The newly developed technology goes beyond listing which components are present in an application. It provides actionable insights into how those components are being used and evaluates their impact on the application’s security.

The new solution shows which vulnerabilities are effective; for instance, it can identify which vulnerabilities actually receive calls from the proprietary code.

It also shows which vulnerabilities are ineffective, underscoring the real impact of open source code on the overall security of the application. Effective Usage Analysis technology lets security and engineering teams cut through the noise and correctly prioritize threats to the security of their products, according to WhiteSource CEO Rami Sass.

“Prioritization is key for managing time and limited resources. By showing security and engineering teams which vulnerable functionalities are the most critical and require their immediate attention, we are giving them the confidence to plan their operations and optimize remediation,” he said.

The company’s goal is to empower businesses to develop better software by harnessing the power of open source. Forrester recognized the company as having the strongest current offering in its 2017 Software Composition Analysis (SCA) Wave report.

WhiteSource’s new Effective Usage Analysis offering addresses an ongoing challenge for open source developers: identifying and correcting security vulnerabilities proactively, instead of watching for and fixing problems after the fact, said Charles King, principal analyst at Pund-IT.

“That should result in applications that are more inherently secure and also improve the efficiency of developers and teams,” he told LinuxInsider. “Effective Usage Analysis appears to be a solid individual solution that is also complementary and additive to WhiteSource’s other open source security offerings.”

Open Source Imperative

As open source usage has increased, so has the number of alerts on open source components with known vulnerabilities. Security teams have become overloaded with security alerts, according to David Habusha, vice president of product at WhiteSource.

“We wanted to help security teams to prioritize the critical vulnerabilities they need to deal with first, and increase the developers’ confidence that the open source vulnerabilities they are being asked to fix are the most pressing issues that are exposing their applications to threats,” he told LinuxInsider.

The current technologies on the market are limited to detecting which vulnerable open source components are in an application, he said. They cannot provide any details on how those components are being used, or on the impact of each vulnerable functionality on the security of the application.

The new technology currently supports Java and JavaScript, and the company plans to add support for additional programming languages. Effective Usage Analysis is in beta testing and will be fully available in June.

How It Works

Effective Usage Analysis promises to cut open source vulnerability alerts dramatically by showing which vulnerabilities are effective (receiving calls from the proprietary code that impact the security of the application) and which are ineffective.

A WhiteSource internal study of Java applications found that only 30 percent of reported alerts on open source components with known vulnerabilities originated from effective vulnerabilities and required high prioritization for remediation.

Effective Usage Analysis also gives developers actionable insights for remediating a vulnerability, including a full trace analysis that pinpoints the path to the vulnerable code. It adds a new level of resolution for understanding which functionalities are effective.

This approach aims to reduce open source vulnerability alerts and provide actionable insights. It identifies the vulnerabilities’ exact locations in the code to enable faster, more efficient remediation.

A Better Mousetrap

Effective Usage Analysis is an innovative technology representing a radical new approach to effectiveness analysis that may be applied to a variety of use cases, said WhiteSource’s Habusha. SCA tools traditionally identify security vulnerabilities associated with an open source component by matching its calculated digital signature with an entry stored in a specialized database maintained by the SCA vendor.

SCA tools retrieve data for that entry based on reported vulnerabilities in repositories such as the NVD, the U.S. government repository of standards-based vulnerability management data.

“While the traditional approach can identify open source components for which security vulnerabilities are reported, it does not establish if the customer’s proprietary code actually references — explicitly or implicitly — entities reported as vulnerable in such components,” said Habusha.
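To illustrate the traditional matching step Habusha describes, the sketch below shows roughly how signature-based identification works in principle: hash each dependency artifact and look the digest up in a database of known-vulnerable components. The hash choice, file layout, and CVE entry are illustrative assumptions, not a description of WhiteSource’s or any other vendor’s actual implementation.

```python
# Minimal sketch of signature-based SCA matching (illustrative only).
import hashlib
from pathlib import Path

# Hypothetical vulnerability database: artifact digest -> reported CVE IDs.
# A real SCA vendor would curate this from sources such as the NVD.
KNOWN_VULNERABLE = {
    "0123456789abcdef0123456789abcdef01234567": ["CVE-2017-5638"],  # placeholder digest
}

def fingerprint(artifact: Path) -> str:
    """Compute the component's digital signature (here, SHA-1 of the file)."""
    return hashlib.sha1(artifact.read_bytes()).hexdigest()

def scan_dependencies(dependency_dir: Path) -> dict:
    """Map each dependency artifact to the CVEs reported for its signature."""
    findings = {}
    for artifact in sorted(dependency_dir.glob("*.jar")):
        cves = KNOWN_VULNERABLE.get(fingerprint(artifact), [])
        if cves:
            findings[artifact.name] = cves
    return findings

if __name__ == "__main__":
    print(scan_dependencies(Path("./lib")))
```

Note what such a sketch cannot do: it reports that a vulnerable component is present, but says nothing about whether the vulnerable code inside it is ever called, which is the gap Effective Usage Analysis is meant to close.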

WhiteSource’s new product is an added component that targets both security professionals and developers. It helps application security professionals prioritize their security alerts and quickly detect the critical problems that demand their immediate attention.

It helps developers by mapping the path from their proprietary code to the vulnerable open source functionality, providing insights into how they are using the vulnerable functionality and how the issues can be fixed.

Different Bait

Effective Usage Analysis employs a new scanning process that includes the following steps:

  • Scanning customer code;
  • Analyzing how the code interacts with open source components;
  • Indicating if reported vulnerabilities are effectively referenced by such code; and
  • Identifying where that happens.

It employs a combination of advanced algorithms, a comprehensive knowledge base, and a new user interface to accomplish those tasks. Effective Usage Analysis enables customers to establish whether reported vulnerabilities constitute a real risk.

“That allows for a significant potential reduction in development efforts and higher development process efficiency,” said Habusha.
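To make the distinction between effective and ineffective vulnerabilities concrete, here is a minimal sketch of the kind of reachability check those steps describe, assuming a call graph has already been extracted from the customer’s code and its open source dependencies. The graph, function names, and breadth-first traversal are hypothetical illustrations, not WhiteSource’s actual algorithms.

```python
# Minimal sketch: decide whether reported vulnerabilities are "effective"
# (reachable from proprietary code) and recover the call trace (illustrative only).
from collections import deque

# Hypothetical call graph: caller -> callees, spanning proprietary and open source code.
CALL_GRAPH = {
    "app.handle_upload": ["commons.fileupload.parse"],
    "app.render_page": ["template.engine.render"],
    "commons.fileupload.parse": ["commons.fileupload.read_headers"],
}

# Functions reported as vulnerable in open source components (e.g., from CVE data).
VULNERABLE_FUNCTIONS = {"commons.fileupload.read_headers", "crypto.lib.weak_hash"}

def trace_to_vulnerability(entry_point: str):
    """Return the call path from proprietary code to a vulnerable function,
    or None if no vulnerable function is reachable from this entry point."""
    queue = deque([[entry_point]])
    seen = {entry_point}
    while queue:
        path = queue.popleft()
        if path[-1] in VULNERABLE_FUNCTIONS:
            return path  # an "effective" vulnerability, with its full trace
        for callee in CALL_GRAPH.get(path[-1], []):
            if callee not in seen:
                seen.add(callee)
                queue.append(path + [callee])
    return None  # "ineffective": present in a dependency but never reached

if __name__ == "__main__":
    print(trace_to_vulnerability("app.handle_upload"))  # full trace to the vulnerable call
    print(trace_to_vulnerability("app.render_page"))    # None: nothing vulnerable is reached
```

In this toy model, a returned path doubles as the full trace a developer could follow to the vulnerable call, while a None result corresponds to an ineffective vulnerability that can be deprioritized.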

Potential Silver Bullet

WhiteSource’s new solution has the potential to be a better detection tool for open source vulnerabilities, suggested Avi Chesla, CTO of Empow Cyber Security. The new detection tools will allow developers to understand the potential risk associated with the vulnerabilities.

The tools “will ultimately motivate developers to fix them before releasing a new version. Or at least release a version with known risks that will allow the users to effectively manage the risks through external security tools and controls,” he told LinuxInsider.

The new approach matters because long-standing vulnerabilities are, and should be, known to the industry, Chesla explained, which gives security tools a better chance of detecting exploitation attempts against them.

Effective Usage Analysis is probably the most important factor, because developers are flooded with alerts, or noise. Separating the signal from that noise is time-consuming and requires cybersecurity expertise, Chesla noted.

The “true” signals are the alerts that represent a vulnerability that actually can be exploited and lead to a real security breach. The cybersecurity market deals with this issue on a daily basis.

“Security analysts are flooded with logs and alerts coming from security tools and experience a similar challenge to identify which alerts represent a real attack intent in time,” Chesla pointed out.

Equifax Factor

The major vulnerability that compromised Equifax last year sent security experts and software developers scrambling for effective fixes. However, it is often a business decision, rather than a security solution, that most influences software decisions, suggested Ed Price, director of compliance and senior solution architect at Devbridge Group.

“Any tools that make it easier for the engineering team to react and make the code more secure are a value-add,” he told LinuxInsider.

In some cases, upgrading a single library cascades down the dependency tree, creating a monumental task that cannot be completed in a single sprint or any reasonable timeframe, Price added.

“In many cases, the decision is taken out of the hands of the engineering team and business takes on the risk of deploying code without the fixes and living with the risk,” Price said, adding that no tool — open source or otherwise — will change this business decision.

“Typically, this behavior will only change in an organization once an ‘Equifax event’ occurs and there is a penalty in some form to the business,” he noted.

Saving Code Writers’ Faces

WhiteSource’s new tool is another market entry that aims to make sense of the interconnected technologies used in enterprise environments, suggested Chris Roberts, chief security architect at Acalvio.

“The simple fact of the matter is, we willingly use code that others have written, cobbling things together in an ever increasingly complex puzzle of collaborative code bases,” he told LinuxInsider, “and then we wonder why the researchers and criminals can find avenues in. It is good to see someone working hard to address these issues.”

The technologies will help if people both pay attention and learn from the mistakes being made. It is an if/and situation, Roberts said.

The logic is as follows: *If* I find a new tool that helps me understand the millions of lines of code that I have to manage or build as part of a project, *and* I accept that the number of errors per 100 lines is still unacceptable, then a technology that unravels those complexities, dependencies and libraries is going to help, he explained.

“We need to use it as a learning tool and not another crutch or Band-Aid to further mask the garbage we are selling to people,” Roberts said.

Necessary Path

Hackers love open source software security vulnerabilities because they are a road map for exploiting unpatched systems, observed Tae-Jin Kang, CEO of Insignary. Given that the number of vulnerabilities hit a record in 2017, according to the CVE database, finding the vulnerabilities is the best first line of defense.

“Once they are found in the code and patched, then it is appropriate to begin leveraging technologies to deal with higher-order, zero-day issues,” Kang told LinuxInsider.

For years, organizations have looked to push back the day of reckoning with regard to OSS security vulnerabilities, treating them as trivial while engineering debt has piled up.

“Equifax has been the clearest illustration of what happens when these two trends meet,” said Kang. “With the implementation of GDPR rules, businesses need to get more aggressive about uncovering and patching security vulnerabilities, because the European Union’s penalties have teeth.”


Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open source technologies. He has written numerous reviews of Linux distros and other open source software.






Can Hackers Crack the Ivory Towers?


Just like leaders in every other field you can imagine, academics have been hard at work studying information security. Most fields aren’t as replete with hackers as information security, though, and their contributions are felt much more strongly in the private sector than in academia.

The differing motives and professional cultures of the two groups act as barriers to direct collaboration, noted Anita Nikolich in her “Hacking Academia” presentation at the CypherCon hacking conference recently held in Milwaukee. Nikolich recently finished her term as the program director for cybersecurity at the National Science Foundation’s Division of Advanced Cyberinfrastructure.

For starters, academics and hackers have very distinct incentives.

“The topics of interest tend to be the same — the incentives are very different,” Nikolich said.

“In the academic community, it’s all about getting tenure, and you do that by getting published in a subset of serious journals and speaking at a subset of what they call ‘top conferences,'” she explained. “For the hacker world … it could be to make the world a better place, to fix things, [or] it could be to just break things for fun.”

These differences in motivations lead to differences in perception — particularly in that the hacker community’s more mischievous air discourages academics from associating with them.

“There is still quite a bit of perception that if you bring on a hacker you’re not going to be able to put boundaries on their activity, and it will harm your reputation as an academic,” Nikolich said.

Deep Rift

The perception problem is something other academics also have observed.

The work of hackers holds promise in bolstering that of academics, noted Massimo DiPierro, a professor at DePaul College of Computing and Digital Media.

Hackers’ findings are edifying even as things stand, he contended, but working side-by-side with one has the potential to damage an academic’s career.

“I think referencing their research is not a problem. I’ve not seen it done much [but] I don’t see that as a problem,” DiPierro said. “Some kind of collaboration with a company is definitely valuable. Having it with a hacker — well, hackers can provide information so we do want that, but we don’t want that person to be labeled as a ‘hacker.'”

Many academics not only decline to work actively with hackers; they don’t even want to be seen with them, even at events such as CypherCon, where Nikolich gave her presentation.

“It’s all a matter of reputation. Academics — 90 percent of them have told me they don’t want to be seen at hacker cons,” she said.

Root Causes

While both researchers agreed that their colleagues would gain from incorporating hackers’ discoveries into their own work, they diverged when diagnosing the source of the gulf between the two camps and, to a degree, even on the extent of the rift.

Academic papers have been infamously difficult to get access to, and that is still the case, Nikolich observed.

“Hackers, I found, will definitely read and mine through the academic literature — if they can access it,” she said.

However, it has become easier for hackers to avail themselves of the fruits of academic study, according to DiPierro.

“A specific paper may be behind a paywall, but the results of certain research will be known,” he said.

On the other hand, academia moves too slowly and too conservatively to keep up with the private sector, or with the hackers whose curiosity reinforces it, DiPierro maintained. This limited approach is due in part to the tendency of university researchers to look at protocols in isolation, rather than at how they are put into practice.

“I think most people who do research do it based on reading documentation, protocol validation, [and] looking for problems in the protocol more than the actual implementation of the protocol,” he said.

Risk Taking

That’s not to say that DiPierro took issue with academia’s model entirely — quite the contrary. One of its strengths is that the results of university studies are disseminated to the public to further advance the field, he pointed out.

Still, there’s no reason academics can’t continue to serve the public interest while broadening the scope of their research to encompass the practical realities of security, in DiPierro’s view.

“I think, in general, industry should learn [public-mindedness] from academia, and academia should learn some of the methodologies of industry, which includes hackers,” DiPierro said. “They should learn to take a little bit more risks and look at more real-life problems.”

Academics could stand to be more adventurous, Nikolich said, but the constant pursuit of tenure is a restraining force.

“I think on the academic side, many of them are very curious, but what they can learn — and some of them have this — is to take a risk,” she suggested. “With the funding agencies and the model that there is now, they are not willing to take risks and try things that might show failure.”

Financial Incentives

While Nikolich and DiPierro might disagree on the root cause of the breakdown between hackers and academic researchers, their approaches to addressing it are closely aligned. One solution is to allow anyone conducting security research to dig deeper into the systems under evaluation.

For Nikolich, that means not only empowering academia to actively test vulnerabilities, but also compensating hackers enough for them to devote themselves to full-time research.

“Academics should be able to do offensive research,” she said. “I think that hackers should have financial incentive, they should be able to get grants — whether it’s from industry, from the private sector, from government — to do their thing.”

In DiPierro’s view, it means freeing researchers, primarily hackers, from the threat of financial or legal consequences for seeking out vulnerabilities for disclosure.

“I would say, first of all, if anything is accessible, it should be accessible,” he said. “If you find something and you think that what you find should not have been accessible, [that] it was a mistake to make it accessible, you [should] have to report it. But the concept of probing for availability of certain information should be legal, because I think it’s a service.”


Jonathan Terrasi has been an ECT News Network columnist since 2017. His main interests are computer security (particularly with the Linux desktop), encryption, and analysis of politics and current affairs. He is a full-time freelance writer and musician. His background includes providing technical commentaries and analyses in articles published by the Chicago Committee to Defend the Bill of Rights.






Flash Storage Adoption in the Enterprise


We’ve heard for a while that flash storage is going mainstream, but how are companies actually using it and what results are they getting? A new report by IT analyst firm Evaluator Group sheds light on enterprise adoption of solid-state storage and why the technology has become so popular.

The firm, which specializes in analysis of data storage and information management, surveyed larger enterprises with more than 1,000 employees that had already deployed all-flash systems. That kept the study focused on organizations with first-hand experience with solid-state storage, Randy Kerns, senior strategist and analyst at Evaluator Group, told me in an interview. After the survey, which was conducted across various vertical markets, analysts interviewed many of the participants to get deeper insight.

Evaluator Group found that most of those surveyed bought all-flash arrays with the goal of speeding database performance so that certain applications ran faster. “The majority of them justified paying extra based on getting the databases to run faster,” Kerns said.

Another top use case was accelerating virtual machine environments, which involves supporting more virtual machines per physical server due to the improved performance with solid-state technology, he said.

Enterprises reported strong results with their flash storage deployments, the study found.

“In all cases, they got what they expected and more, to the point that they added additional workloads that weren’t performance demanding … They had more capabilities than they planned on, so they added more workloads to their environment,” Kerns said. “And the future is adding more workloads or buying more all-flash systems for putting more workloads on.”

Organizations surveyed also reported improved reliability, with fewer interruptions either due to a device or system failure. “That was a big improvement for them,” he said. “It’s something they hadn’t counted on in their initial purchase.”

Survey participants said they valued the data protection capabilities of solid-state storage systems, such as snapshots. “The systems had the capabilities to do things differently so they could accelerate their data protection processes,” Kerns said.

Data reduction functionality wasn’t high on their list of solid-state features, as they considered it a basic capability of flash storage systems, according to Evaluator Group.

While solid-state storage has a reputation for being pricey, it wasn’t an issue for the survey participants, Kerns said. “These people already had them [all-flash systems], so the battle about cost is in the rear view mirror,” he said. “First-time buyers may have a sticker-shock issue, but for those who bought it, that’s history.”

When buying flash storage, enterprises tend to turn to their current storage systems vendor, the study found. “Incumbency wins,” Kerns said. A few bought from storage startups, but the majority preferred to stick with their existing vendor, enjoying new systems that operated in a similar fashion to what they already had.

As for going all-flash, enterprises expect that will be the case eventually, but certainly won’t happen overnight. “They have a number of platforms that have a certain lifespan. They’ll just age those systems out, so it could be a number of years until they get to that point,” Kerns said.





Top Trends Impacting Enterprise Infrastructure


Enterprise infrastructure teams are under massive pressure as the cloud continues to upend traditional IT architectures and ways of providing service to the business. Companies are on a quest to reap the speed and agility benefits of cloud and automation, and infrastructure pros must keep up.

In this rapidly changing IT environment, new technologies are challenging the status quo. Traditional gear such as dedicated servers, storage arrays, and network hardware still have their place, but companies are increasingly looking to the cloud, automation, and software-defined technologies to pursue their digital initiatives.

According to IDC, by 2020, the heavy workload demands of next-generation applications and IT architectures will have forced 55% of enterprises to modernize their data center assets by updating their existing facilities or deploying new facilities.

Moreover, by the end of next year, the need for better agility and manageability will lead companies focused on digital transformation to migrate more than 50% of their IT infrastructure in their data center and edge locations to a software-defined model, IDC predicts. This shift will speed adoption of advanced architectures such as containers, analysts said.

Keith Townsend, founder of The CTO Advisor and Interop ITX Infrastructure Track Chair, keeps a close eye on the evolution of IT infrastructure. Read his advice on what he sees as the top technologies and trends for infrastructure pros today: hyperconvergence, network disaggregation, cloud migration strategies, and new abstraction layers such as containers.






6 Hot Tech Trends That Will Impact the Enterprise in 2018


The start of a new year always brings a flood of forecasts from technology pundits for what might happen in the next 12 months. For some reason, 2018 triggered even more prognostications from tech experts than usual. We received dozens of predictions for networking, storage, and data center trends that IT pros should expect to see this year.

After sorting through them, we noticed a pattern: many experts predict more of the same. The trends and hot technologies from 2017, such as machine learning and automation, will continue to influence IT infrastructure into 2018, but the pace and intensity of innovation and adoption seem likely to increase.

“It’s no secret that AI and machine learning are driving a lot of the innovation across the various ecosystems and technology domains that IT cares about,” Rohit Mehra, program VP of network infrastructure at IDC, said in a webcast on the firm’s 2018 predictions for worldwide enterprise infrastructure.

In fact, the rapid incorporation of AI into the workplace will mean that by 2021, more than half of enterprise infrastructure will use some form of cognitive and artificial intelligence to improve productivity, manage risk, and reduce costs, according to IDC.  

To be sure, 2018 will be another year of rapid change for IT infrastructure. Read ahead for six key tech trends that infrastructure pros should keep an eye on in the months ahead.



