
Why Zero-Trust Companies Should Consider 100% Biometric Authentication


Companies across the globe share the same struggle when it comes to hardware, software, and network security. How do you keep all of your internal resources and assets safe while not interfering with daily operations within your organization? New solutions and strategies have emerged in recent years to make corporate security a more efficient process.

One concept that has gained traction in recent years is the zero-trust model that leverages biometrics in the workplace. Instead of trusting that a password is enough to verify someone’s identity, biometrics goes one step further and uses a physical element to authenticate users. This can include fingerprint scanning, retina scanning, or some other biological feature.

But what does the zero-trust model actually mean in the modern workplace, and how can biometrics play a key role?

The old model

Decades ago, enterprise IT architecture followed relatively simple models. Companies usually had a set of back-end servers that handled database, web hosting, and application processing duties. To protect them, a firewall was placed around the perimeter of the assets to filter network traffic and block potential attacks.

In order to connect to one of the back-end servers in this traditional model, a user or administrator would simply authenticate with their network credentials. If they happened to be located remotely, they could use a virtual private network (VPN) tool to tunnel their traffic into the corporate network.

But no matter how strictly your firewall policy was configured, there was a great deal of risk in trusting password-based authentication. The security perimeter was only as strong as its weakest link, meaning that if one user’s password was exposed or lost, it could mean disaster for the entire organization.

The need for zero-trust

Nowadays, enterprise networks and system architectures are constantly expanding and evolving to meet the changing needs of a business. It’s no longer possible to maintain a single firewall perimeter around the various pieces of infrastructure and digital systems being hosted and used.

To address the current predicament facing many organizations, the term zero-trust security has emerged. At a basic level, zero-trust is an approach that requires valid authentication before any access or permission is granted to a user, even one physically located within the private network. The zero-trust model is designed to safeguard against the risks that come with an increased reliance on the more complex model of cloud computing, especially when it comes to data storage and the hosting that goes along with it.

Even the most oft-recommended cloud hosting companies encounter technical or security issues from time to time, as evidenced by the recent outage of Google’s Public Cloud – a competitor to Microsoft Azure and Amazon AWS. According to Gary Stevens of community research group HostingCanada.org, many of the ostensibly ‘best’ web hosts have uptimes of only 98 percent annually. That may sound like plenty of uptime, but 2 percent of a year equates to more than seven days of downtime. Ultimately, this means that some of your core resources and data will inevitably live outside your company’s immediate control. This is where zero-trust can help.

One of the core pillars of zero-trust network security is the principle of least privilege (POLP). This stipulates that each user in an organization should be granted only the minimum access required to fulfill their job duties, a concept that has not always been popular with employees. Administrative accounts should be reserved for administrative tasks rather than used day-to-day to gain elevated privileges.

To accompany POLP, organizations need to move past the single-perimeter model and adopt a strategy called microsegmentation. This involves analyzing your entire IT infrastructure, including cloud resources, and breaking it up into the smallest pieces possible. From there, access should be controlled at the level of each individual segment.
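To make the principle concrete, here is a minimal sketch of a default-deny, per-segment access check. The roles, segments, and grants are invented for illustration and are not drawn from any particular product:

```python
# Minimal least-privilege check: every request is denied unless an explicit
# grant exists for that user's role on that specific network segment.
# The roles, segments, and grants below are illustrative placeholders.

GRANTS = {
    # role           -> segments the role may reach, with allowed permissions
    "db-admin":      {"db-segment": {"read", "write"}},
    "web-developer": {"web-segment": {"read", "write"}, "db-segment": {"read"}},
    "hr-analyst":    {"hr-segment": {"read"}},
}

def is_allowed(role: str, segment: str, permission: str) -> bool:
    """Default deny: access requires an explicit grant on this exact segment."""
    return permission in GRANTS.get(role, {}).get(segment, set())

# An hr-analyst asking to write to the database segment is refused, even
# though the request originates from inside the corporate network.
assert is_allowed("db-admin", "db-segment", "write")
assert not is_allowed("hr-analyst", "db-segment", "write")
```

The important property is the default: a request is refused unless an explicit grant exists for that role on that exact segment, which is microsegmentation’s least-privilege posture expressed in a few lines.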

Best practices with biometrics

Many data breaches that you hear about in the news are a result of poor password management practices. To help combat this, more companies are beginning to require multi-factor authentication (MFA) for their critical applications and data repositories.

With MFA, each user logs in with their normal account and password and is then prompted to verify their identity a second way, typically through a one-time code sent by text message to their cell phone. But of course, if a hacker has managed to compromise a person’s account, there is a good chance their text messages could be intercepted as well, whether through SIM swapping or malware on the device.
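One widely used alternative to SMS codes is a time-based one-time password (TOTP) generated on the user’s own device. Below is a minimal sketch using the open-source pyotp library; the account name, issuer, and secret handling are simplified placeholders for illustration:

```python
# Minimal TOTP second factor using the pyotp library (pip install pyotp).
# In production the per-user secret would be generated at enrollment and
# stored server-side; here it lives in a local variable for illustration.
import pyotp

secret = pyotp.random_base32()   # shared once with the user's authenticator app
totp = pyotp.TOTP(secret)

print("Provisioning URI for an authenticator app:",
      totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleCorp"))

# At login, after the password check succeeds, verify the 6-digit code.
code = totp.now()                # in reality, typed in by the user
print("Second factor accepted:", totp.verify(code))
```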

To truly adopt a zero-trust security approach, the best solution is to integrate biometrics into your access workflows. It represents the only way to have full confidence in identity validation, given the strength of fingerprint and retina scanning, as well as voice and facial recognition.

Looking ahead

Investing in biometric scanning technology may be daunting for some organizations, especially smaller companies working within a tight IT budget. Nevertheless, network security should be treated as a top priority and fortunately there are tools on the market today to make that more feasible.

The concept of bring your own device (BYOD) to work is usually seen as a security risk. However, companies that have already taken a preventative approach built around traditional tools, such as security software, VPN-encrypted internet connections, and firewall protection, may find that smartphone technology can actually be leveraged in a zero-trust model. Instead of acquiring separate biometric hardware, the second level of verification can occur through a phone’s fingerprint scanner, microphone, or camera.
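Conceptually, the phone-as-authenticator model is a challenge-response exchange: a private key on the device is unlocked only by a successful local biometric check, and the server verifies a signature rather than a shared secret. The sketch below illustrates that flow with the cryptography package; real deployments would rely on a standard such as FIDO2/WebAuthn rather than hand-rolled signing, and the biometric prompt here is a stand-in for the phone OS API:

```python
# Simplified challenge-response flow behind phone-based biometric MFA.
# (pip install cryptography). Real deployments should use FIDO2/WebAuthn;
# this sketch only illustrates the possession-plus-biometric idea.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Enrollment: the phone generates a keypair; the server stores the public key.
phone_key = Ed25519PrivateKey.generate()
server_stored_pubkey = phone_key.public_key()

# Login: the server issues a random challenge.
challenge = os.urandom(32)

# On the phone, the OS releases the private key only after the local
# fingerprint/face check passes, then signs the challenge.
biometric_check_passed = True            # placeholder for the OS biometric prompt
if biometric_check_passed:
    signature = phone_key.sign(challenge)

# Server side: a valid signature proves possession of the enrolled device.
try:
    server_stored_pubkey.verify(signature, challenge)
    print("Second factor accepted")
except InvalidSignature:
    print("Second factor rejected")
```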

At some point in the future, there’s a high likelihood that the password model will become antiquated. Companies will have no choice but to adopt the zero-trust network strategy in order to safeguard their internal data and systems. It will be the new normal to authenticate yourself using biometrics for every action you take on a piece of technology.

Final thoughts

Modern companies face increasing challenges when it comes to protecting their IT systems. With the advancement of cloud computing, enterprise technology is becoming more spread out and harder to control. Putting excessive security on top of systems can slow down the operations of a business and create more problems than it solves.

With the zero-trust network model, an organization forces each user to authenticate themselves before they can perform an action on a server or other piece of infrastructure. Password authentication comes with a huge list of vulnerabilities, which means other solutions should be strongly considered.

Investing in biometrics can greatly reduce a company’s risk profile because of the accuracy involved in fingerprint, retina, voice, and face scanning. Whether a cyberattack starts from internal or external threats, having a zero-trust policy in place fortified by biometrics can help stop the damage.




The Network Edge: Stretching the Boundaries of SD-WAN


The advent of SD-WAN has dramatically disrupted the enterprise networking landscape in the last five years. Industry experts and analysts opine that it is unlike anything they have seen in decades in the networking arena. Leading SD-WAN solutions have enabled dramatic real-time application performance improvements, simplicity, and automation for implementation and management of wide-area networks, and optimized cloud access.

However, SD-WAN technology is still evolving. New functionalities and integrations are being added at a rapid pace. The boundaries of SD-WAN are now stretching deeper and broader beyond just the WAN edge into the “Network Edge.” The Network Edge is enabling the next wave of network transformation by absorbing new functions, including compute, analytics, security, and multi-cloud, all of which are critical to supporting the enterprise locations where business is conducted.

In SD-WAN implementations, organizations deploy an edge device at the branch and edge devices close to their workloads (the public cloud and the data center) to create a mesh of connections between locations, with direct access to the cloud that avoids backhauling traffic to the corporate data center. Optionally, they can place a virtual edge device in the cloud to provide additional control and optimization for IaaS-hosted applications. SD-WAN is managed from a central console using templates and automation. This approach to network design and simplified management is vital to the evolution of the WAN from a traditional hub-and-spoke architecture that was difficult to configure, especially at scale, and limited in how it served applications in the cloud.
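To illustrate the template-and-automation idea, here is a hedged sketch of how a central controller might expand one tunnel template into a full mesh. The site names and tunnel parameters are invented, not drawn from any vendor’s configuration schema:

```python
# Hedged sketch: expanding a single site template into a full-mesh overlay.
# Site names and tunnel parameters are illustrative, not from any product.
from itertools import combinations

SITES = ["branch-nyc", "branch-sfo", "datacenter", "cloud-vpc"]

TUNNEL_TEMPLATE = {
    "encryption": "aes-256-gcm",
    "keepalive_seconds": 10,
}

def build_mesh(sites):
    """Return one tunnel definition per site pair (n*(n-1)/2 tunnels)."""
    return [
        {**TUNNEL_TEMPLATE, "endpoints": (a, b)}
        for a, b in combinations(sites, 2)
    ]

for tunnel in build_mesh(SITES):
    print(tunnel)
```

The point of the template approach is that adding a fifth site changes one list entry, not dozens of hand-built device configurations.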

Given where SD-WAN resides strategically in the network and its management capabilities, SD-WAN has become the platform of choice for the evolution of the network edge. Let’s look at how the edge is developing and how advanced SD-WAN solutions make this evolution possible.

The need for edge compute services

The “branch” has evolved beyond the traditionally understood branch office confined by four walls. The Internet of Things (IoT) and mobility are redefining the branch, which can now be an agricultural field with machines and devices that interact with each other. This new paradigm has increased the requirements for edge services and created the option for the edge to go deeper into the branch office, crossing the LAN boundary to support IoT device traffic. This approach needs an advanced SD-WAN platform capable of delivering compute services at the edge. With edge compute, one major challenge is managing the deployment and configuration of services. Advanced SD-WAN solutions such as VMware SD-WAN provide a virtualization infrastructure for hosting the services and centralized management of the platforms so that edge compute services can easily be delivered.

The need for quick access to broadband

While transport independence is a hallmark of SD-WAN, easy access to broadband is a growing requirement of the edge. There is an emerging approach that could leverage 5G for a low-latency connection and on-demand control. It will be possible to deploy 5G in a very short time compared to landlines, making it ideal for use cases such as pop-up stores and temporary field sites. 5G will be versatile, too. Organizations will be able to specify on the fly what type of throughput and network characteristics they want. Once that is done, the right link configuration is automatically applied to deliver the specified connection. 5G would allow advanced SD-WAN platforms to treat the network underlay as not just one underlay, but a configurable one. It is programmable, so organizations can ask for the specifications they need with regard to bandwidth and traffic handling. The ability to run an overlay with the intelligence of SD-WAN on top of the underlay intelligence of 5G is revolutionary. This approach focuses on using 5G as a transport mechanism for enterprise data, not on 5G consumer phones.

The need for multi-region networks

Continuing with the evolution of how applications are accessed, one can see a need to span telco networks to serve the needs of global corporations. This will be achieved by using a federation of VMware SD-WAN Gateways to create an over the top (OTT) service that can interoperate gateway to gateway connecting independent telco networks. For example, if one telco network doesn’t reach a geography where the corporation has a presence, then the organization can use the federated gateways to link to other telco networks. These federated gateways extend the telco’s network beyond the facilities that they own, creating a global virtual WAN in a telco-to-telco federation.

The need for a service delivery platform

The next piece of the SD-WAN evolution is SD-WAN as a platform. There are many services that organizations need to run at their branch offices. However, they have concerns about device sprawl and the ease of managing these services. Deploying a service as a virtual network function (VNF) eliminates the need for separate hardware at the branch office location. Again, advanced SD-WAN solutions such as VMware SD-WAN provide an NFV infrastructure for this, making deployment and management easy. This allows organizations to deliver additional services from the edge platform. Network analytics is a popular choice for this type of service. Companies can take analytics from the edge SD-WAN platform and correlate them with analytics from other devices, such as servers, end-user devices, switches, and routers, to check for anomalous behavior and discover the root cause. This can greatly reduce the time to resolve network performance issues. With SD-WAN as a platform, organizations can deploy virtualized functions and manage them from the same console.
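As a rough illustration of that correlation step, the sketch below flags time windows where both the SD-WAN edge and a server deviate sharply from their own baselines; the metric series and the 2-sigma threshold are invented for the example:

```python
# Rough sketch of cross-device correlation: flag sample windows where both
# the SD-WAN edge and a server deviate sharply from their own baselines.
# The metric series and the 2-sigma threshold are illustrative.
from statistics import mean, stdev

def anomalies(series, sigmas=2.0):
    """Indices where a sample sits more than `sigmas` stdevs from the mean."""
    mu, sd = mean(series), stdev(series)
    return {i for i, x in enumerate(series) if abs(x - mu) > sigmas * sd}

edge_latency_ms = [20, 21, 19, 22, 95, 20, 21]   # per-minute samples
server_cpu_pct  = [30, 32, 31, 29, 98, 33, 30]

# Windows anomalous on both devices point toward a shared root cause.
correlated = anomalies(edge_latency_ms) & anomalies(server_cpu_pct)
print("Investigate minutes:", sorted(correlated))   # -> [4]
```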

The need for hybrid and multi-cloud support

The final evolution in application access that SD-WAN needs to support is hybrid and multi-cloud integration. As organizations continue to increase their use of the cloud to host applications and adopt SaaS applications, direct access with high performance is critical. For applications hosted in the public cloud, as well as SaaS applications, advanced cloud-hosted, multi-tenant SD-WAN gateways can direct traffic to these applications. The gateway does the traffic steering and provides optimizations between itself and the edge device.

There are some instances where part of the application resides in the data center and part resides in the cloud, creating a hybrid cloud model. In this case, SD-WAN needs to create optimized connections to both locations and handle traffic steering appropriately. Furthermore, some organizations utilize multiple clouds for hosting the applications, so the SD-WAN solution needs to provide optimized connections to each of the clouds and manage traffic to and between them.
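A hedged sketch of the steering decision itself: a per-application policy table maps each flow to the gateway nearest its hosting cloud, falling back to the data-center path. The application names and gateway addresses are invented placeholders:

```python
# Hedged sketch of per-application traffic steering across clouds.
# Application names and gateway endpoints are invented placeholders.

STEERING_POLICY = {
    "crm-app":       "gateway.aws-east.example.net",    # hosted in AWS
    "analytics-app": "gateway.azure-west.example.net",  # hosted in Azure
    "erp-frontend":  "gateway.aws-east.example.net",    # hybrid: UI in cloud...
    "erp-database":  "gateway.datacenter.example.net",  # ...data tier on-prem
}

DEFAULT_NEXT_HOP = "gateway.datacenter.example.net"

def next_hop(application: str) -> str:
    """Pick the optimized gateway for this app; fall back to the data center."""
    return STEERING_POLICY.get(application, DEFAULT_NEXT_HOP)

print(next_hop("erp-database"))   # gateway.datacenter.example.net
print(next_hop("unknown-app"))    # falls back to the data-center path
```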

The vision for the edge

These evolutionary areas are where we see SD-WAN headed, and we call this direction the new Network Edge because it’s beyond the traditional SD-WAN functions. It includes edge computing, fast deployment of intelligent high-speed connections, SD-WAN as a broader service delivery platform, connecting multiple networks, and integrating with hybrid and multi-cloud models. All of these capabilities go beyond the definition of today’s SD-WAN and enable the evolution of the WAN.

 




Mirai is Back and Tougher than Before


Mirai, the highly disruptive malware strain that got its name from a 2011 Japanese TV show, is back on the beat and even “better” than before. Programmers have modified the original botnet beast, and it’s now screeching its way through enterprise-level Internet of things (IoT) devices.

The original Mirai crash-landed in 2016. A sophisticated piece of malware programming, it snatched control of networked devices and assimilated them into a ferocious botnet. Even low-level programmers were able to access thousands of gadgets and computers and orchestrate distributed denial-of-service (DDoS) attacks. ADSL modems, routers, and network cameras proved most vulnerable to the well-engineered strain.

Mirai: A DDoS powerhouse

Ultimately, Mirai played a central role in several infamous DDoS attacks against multiple high-profile targets, including the French hosting company OVH.com, the website of venerated online security reporter Brian Krebs, and DNS provider Dyn; the Dyn attack crippled popular sites like Reddit, GitHub, Airbnb, and Netflix for a period. Rutgers University and the African country of Liberia also suffered under the malware’s grip.

And for months, Mirai’s author remained anonymous. Eventually, the malware entered the halls of hacker infamy. James Ferraro, an electronic composer and musician, even name-checked the notorious malware on his 2018 album “Four Pieces for Mirai.”

However, in 2017, Krebs revealed his suspicion that a programmer going by the alias Anna-senpai (government name: Paras Jha) penned Mirai. A student at Rutgers with a dorm-room business, Jha initially denied the charges. Then the FBI got involved, and on December 13, 2017, Jha and two other people pled guilty to criminal charges related to the Mirai botnet attacks. Ultimately, a judge sentenced Jha to six months of home confinement and ordered him to pay $8.6 million in restitution.

Mirai is back and more dangerous

Before Jha and his co-conspirators were sentenced, Mirai’s source code found its way online, and like-minded programmers took up the mantle. The result: new Mirai strains that can weasel their way into enterprise IoT devices and make use of all that business bandwidth, which could, theoretically, result in an attack of historic proportions.

In the fall of 2018, researcher Matthew Bing explained in a blog post:

“Like many IoT devices, unpatched Linux servers linger on the network and are being abused at scale by attackers sending exploits to every vulnerable server they can find. [We have] been monitoring exploit attempts for the Hadoop YARN vulnerability in our honeypot network and found a familiar, but surprising payload – Mirai.”

Vulnerable devices

According to Kaspersky Lab, second-generation Mirai strains account for about 21 percent of all infected IoT devices. Additionally, the latest versions are even more flexible than the original and can exploit a wider range of targets, including enterprise-class controllers, wireless presentation systems, and digital signage. Analysts warn that the following devices are particularly vulnerable:

  • D-Link DCS-930L network video cameras;
  • Netgear WG102, WG103, WN604, WNDAP350, WNDAP360, WNAP320, WNAP210, WNDAP660, and WNDAP620 devices;
  • Netgear DGN2200 N300 Wireless ADSL2+ modem routers;
  • Netgear ProSafe WC9500, WC7600, and WC7520 wireless controllers;
  • WePresent WiPG-1000 wireless presentation systems;
  • LG Supersign TVs;
  • D-Link DIR-645 and DIR-815 routers; and
  • Zyxel P660HN-T routers.

Many security experts strongly suspect that Industrial IoT devices may now also be vulnerable.

Guarding Against a Mirai Infection

Now that you know what Mirai is, you’re probably wondering: What measures should be taken to prevent infection?

Researchers and security engineers broadly agree that IT departments should take the following steps (a minimal audit sketch follows the list):

  • Take inventory of all IoT devices connected to their networks
  • Change default passwords across the board
  • Ensure that every device connected to the Internet is up-to-date on patches
  • Create a preventative strategy that includes firewalls, VPNs, and anti-virus and anti-malware software
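The original Mirai spread by logging into devices over telnet with factory-default credentials, so an inventory pass can start with something as simple as checking for open telnet ports. Below is a minimal sketch; the address range is a placeholder, and you should only scan networks you are authorized to test:

```python
# Minimal audit sketch: Mirai's original strain spread over telnet using
# factory-default logins, so an open port 23 on your network is worth a look.
# The address range below is a placeholder; scan only networks you own.
import socket
from ipaddress import ip_network

def telnet_open(host: str, timeout: float = 0.5) -> bool:
    """True if TCP port 23 accepts a connection."""
    try:
        with socket.create_connection((host, 23), timeout=timeout):
            return True
    except OSError:
        return False

for addr in ip_network("192.168.1.0/28").hosts():   # placeholder range
    if telnet_open(str(addr)):
        print(f"{addr}: telnet open - check for default credentials")
```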

It may even be worth the investment to bring in a third-party expert to ensure your system is locked down properly. Companies that don’t have an in-house IT department should definitely summon a security professional for a threat of Mirai’s magnitude.

Businesses aren’t the only ones who must worry about Mirai. Every individual with a home network should also take measures to protect against the malware. Many home routers ship with default credentials that hackers can easily exploit. Making a network unattractive to Mirai-wielding ne’er-do-wells simply involves changing those defaults.

Online privacy concerns and compliance

Malware is part of an ever-expanding landscape of online privacy concerns. And as legislation grows up around technological advancements, businesses need to be more cognizant of the intersection between data safekeeping and government breach regulations.

For example, did you know that in many jurisdictions, under certain circumstances, companies can be held legally and financially responsible for data breaches? Be sure to take reasonable steps to protect your company from liability in the event of an attack.

The bottom line

Everyone needs to be aware of the threat that Mirai and its malware spawn present. Get your network shored up sooner rather than later, because the next big Mirai-rooted attack will likely cause tremendous chaos, the likes of which the world has never seen.


 

 




Why Cloud-based DCIM is not Just for Data Centers


Just as technology and its use are evolving at a tremendous pace, the physical infrastructure which supports IT equipment is also being transformed to support these advances. There are some significant trends driving new approaches to the way technology is being deployed, but there are also important ramifications for the way that the basics – power, cooling, space – have to be provisioned and, more importantly, managed.

Firstly, a massive shift towards hybrid infrastructure is underway, says Gartner. The analyst predicts that by 2020, cloud, hosting, and traditional infrastructure services will be on a par in terms of spending. This follows on from earlier research which indicates an increase in the use of hybrid infrastructure services. As companies have placed an increasing proportion of IT load into outsourced data center services and cloud, both the importance and proliferation of distributed IT environments have been heightened.

Secondly, the IoT – or more specifically the Industrial IoT – has quietly been on the rise for a couple of decades. Industrial manufacturing and processing have long used data to remain competitive and profitable, and these companies must continually strive to optimize efficiency and productivity. The answer is being sought through more intelligent and more automated decision-making, most of it data-driven, with the data almost exclusively gathered and processed outside traditional data center facilities.

Thirdly, rapidly developing applications such as gaming and content streaming, as well as emerging uses like autonomous vehicles, require physical resources that are sensitive to both latency and bandwidth limitations. Closing the physical distance between data sources, processing, and use is the pragmatic solution, but it also means that centralized data centers are not the answer. Most of the traction for these sorts of services is where large numbers of people reside, exactly where contested power, space, and connectivity add unacceptable cost to large facility operations.

The rise of distributed IT facilities and edge data centers

In each of these examples – and there are more – IT equipment has to be run efficiently and reliably. Today there’s little argument with the fact that the best way to enable this from an infrastructure point of view is within a data center. Furthermore, the complexity of environments and the business criticality of many applications means that data center-style management practices need to be implemented in order to ensure that uptime requirements are met. And yet, data centers per se only partially provide the answer, because distributed IT environments are becoming an increasingly vital part of the mix.

The key challenges that need to be resolved where multiple edge and IT facilities operate in diverse locations include visibility, availability, security, and automation – functions which DCIM has a major role in fulfilling for mainstream data centers. You could also add human resources to the list, because most data center operations, including service and maintenance, are delivered by small, focused professional teams. When you add the complication of distributed locations, you have a recipe for having the wrong people in the wrong place at the wrong time.

Cloud-based DCIM answers the need for managing Edge Computing infrastructure

DCIM deployment in any network can be both complex and potentially high-cost (whether delivered on-premises or as-a-service). By contrast, cloud-based DCIM, or DMaaS (Data Center Management-as-a-Service), overcomes this initial inertia to offer a practical solution to the challenges being posed. Solutions such as Schneider Electric EcoStruxure IT enable physical infrastructure in distributed environments to be managed remotely for efficiency and availability using no more than a smartphone.


DMaaS combines simplified installation and a subscription-based approach coupled with a secure connection to cloud analytics to deliver smart and actionable insights for the optimization of any server room, wiring closet or IT facility. This means that wherever data is being processed, stored or transmitted, physical infrastructure can be managed proactively for assured uptime and Certainty in a Connected World.
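The agent side of that model is conceptually simple: read local sensors and ship the readings to a cloud ingest endpoint over HTTPS. The sketch below is illustrative only; the URL and payload shape are invented and are not Schneider Electric’s actual API:

```python
# Hedged sketch of the agent side of cloud-based monitoring: read local
# sensors, ship them to a cloud ingest endpoint over HTTPS. The URL and
# payload shape are invented; they are not any vendor's actual API.
import time
import requests   # pip install requests

INGEST_URL = "https://ingest.example-dmaas.net/v1/readings"   # placeholder

def read_sensors() -> dict:
    """Placeholder for real sensor reads (SNMP, Modbus, IPMI, etc.)."""
    return {"site": "wiring-closet-3", "temp_c": 24.5, "ups_load_pct": 41.0}

while True:
    reading = read_sensors()
    # The cloud side aggregates readings across sites and raises alerts.
    requests.post(INGEST_URL, json=reading, timeout=5)
    time.sleep(60)
```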

Read this blog post to find out more about the appeal of cloud-based data center monitoring, or download our free white paper, “Why Cloud Computing is Requiring us to Rethink Resiliency at the Edge.”

 




LVFS Could Be Hosting 10k+ Firmware Files By End Of 2019



LVFS, the Linux Vendor Firmware Service, which pairs with fwupd to offer firmware/BIOS updates to Linux users, could be serving more than ten thousand distinct firmware files before the end of the calendar year.

Richard Hughes of Red Hat, who has been leading fwupd/LVFS development, has been quite busy as of late. In addition to courting more hardware vendors, eyeing the enterprise, becoming a Linux Foundation project, and hitting a goal of serving more than 500,000 firmware files to Linux users in a single month, the project is on a trajectory to offer more than ten thousand different firmware files this year.

Hughes noted in a mailing list post that they have grown from dozens of firmware files to thousands, and expect “tens of thousands of files before the year is finished.”

That’s quite an ambitious goal and we’ll certainly be monitoring its progress. This goal was mentioned as part of some shell / user experience improvements to the LVFS given the growing number of firmware offerings.