How Network Security is Changing

November 29, 2018 at 8:00 AM

 

With more users providing more information on more networks, and some of those networks being breached, it’s no wonder that network security is a hot topic for many people and businesses. In fact, network security is a top concern for many IT administrators: as network technology has become more advanced, so have the ways in which those networks can be hacked. Here, we’ll take a look at some of the changes happening in network security to counteract those threats.

 

·        Network security will no longer be a small branch of IT, but rather a responsibility that encompasses the entire organization. The network is a part of every aspect of IT, and thus it will need to be protected by all administrators. Plus, the network itself should be armed with tools that help prevent attacks, reinforcing network-based security measures such as firewalls and malware prevention.

 

·        Today, so much is moving to the Cloud, and security is no exception. Many IT departments will take advantage of cloud-based services that manage common security tasks as part of a contracted agreement. This allows a simple subscription plan to provide dynamic security measures that once took administrators hours to manage. Multi-cloud security management platforms will also come into play, allowing network security to be managed across public and private clouds via a single security control plane.

 

·        We’ve all seen the IT email alert go out warning of a phishing scam designed to get employees to click a link that then installs malware on the network. Because of these types of attacks, end-point security has become a priority, and it now extends to include mobile device management for non-corporate-owned devices. In the future, many of these end-point security measures will be simplified, consolidating the number of protection tools in place to provide better overall protection.

 

·        Penetration (pen) testing is a valuable tool used to detect security gaps that might not have been identified by IT staff. Outside firms are often hired to perform this testing for a true outside-party perspective, but the service can be costly and needs to be done in a timely manner. With the development of new AI and automation technology, pen testing is becoming easier and less expensive, meaning that it can be done more often, which is especially beneficial for keeping up with the increase in new viruses, malware and other cyber threats (a simple automated check is sketched below).
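
As a rough illustration of what “automated” testing can look like at its most basic, here is a minimal Python sketch (standard library only) that checks whether a handful of commonly exposed ports are reachable on a host. The host address and port list are placeholders; real pen testing tools go far deeper than this, and a check like this should only be run against systems you are authorized to test.

import socket

# Placeholder target and ports for illustration only; 192.0.2.10 is a documentation/test address.
HOST = "192.0.2.10"
COMMON_PORTS = [22, 80, 443, 3389]

def probe(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in COMMON_PORTS:
        state = "open" if probe(HOST, port) else "closed/filtered"
        print(f"{HOST}:{port} -> {state}")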

  

With these upgrades in network security, along with an increased focus on overarching enterprise-wide security procedures, today’s networks will be able to keep up with tomorrow’s technology and do it securely.

 

5 Tips for Building a Modern Data Center

November 8, 2018 at 8:00 AM

 

The world of technology is always changing, and the same goes for data centers. Data centers play a critical role in networking and have evolved to give businesses better access to their data, with improved usability and easier management. Now, they must also adapt to be cost-effective, efficient and responsive with the technology they support. Accelerating business demands call for more storage and resources, and new technologies require improved infrastructure. Gone are the days of “bigger is better”; what we need now are smarter, more efficient, easier-to-manage and scalable data centers. Here are 5 ways to usher in the modernization of the data center:

 

1.      Make Appliances Multi-Task

 

Instead of having dedicated resources for only one purpose, combining the computing and storage capabilities of devices into one makes data centers more economical and efficient. This allows the data center to be designed with a single tier that can fulfill all of the server and storage needs of any application. Plus, it improves scalability without requiring additional or specialized equipment.

 

2.      Flexibility is Key

 

Being nimble and flexible is important in today’s world of technology, and that applies to data centers too. A modular data center design is more flexible and simpler, and it allows for easy additions and removals as needed, which makes for better management of resources. One approach that’s gaining traction is combining the storage and computing tiers into a single appliance (as referenced in tip #1), which streamlines the overall data center design into a single console and makes management easier than ever.

 

3.      Consumer Focus

 

In addition to multi-tasking and flexibility, data centers also have to be more resilient and reliable than before. Today’s data centers have to support traditional technology as well as newer virtual desktop infrastructure (VDI) and mobile devices. This has led to a more consumer-focused approach that allows employees to access desktops, applications and data from within the data center via any device, anywhere. This modern approach also allows IT admins to better manage a range of consumer-based workload demands as well as VDI systems, storage data services and existing virtual applications.

 

4.      Cloud Fusion

 

To keep essential business applications safe within a private data center while still being able to access the public cloud for other workloads, hybrid clouds are the way to go. Hybrid cloud environments offer the best of both worlds by fusing the public cloud benefits of on-demand resource sharing among multiple users with the security, service and performance of private clouds. It’s a win-win.

 

5.      Continuation of Service

 

Most data centers have a plan for disaster recovery, but what about an interruption of service or a latency issue? Users are accustomed to accessing their data quickly, whenever they need it, and a connection slowdown, or complete shutdown, can lead to employees using unauthorized cloud-based services. Thus, in addition to a disaster recovery plan, admins should have a plan to provide service continuity as well. This means designing data centers to be highly available, with high bandwidth and low round-trip times (a simple latency check is sketched below). Distributing applications across multiple sites, geographic regions or data centers can also be helpful, plus it improves scalability and performance.
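
To make “low round-trip times” a little more concrete, here is a minimal Python sketch (standard library only) that measures TCP connect latency to a few hypothetical site endpoints and reports the fastest one. The site names and host addresses are placeholders; a production environment would rely on proper load balancing and monitoring rather than a script like this.

import socket
import time

# Hypothetical redundant sites; replace with real endpoints.
SITES = {
    "us-east": ("site-east.example.com", 443),
    "us-west": ("site-west.example.com", 443),
    "eu-central": ("site-eu.example.com", 443),
}

def connect_rtt(host, port, timeout=2.0):
    """Return the TCP connect time in milliseconds, or None if the site is unreachable."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000.0
    except OSError:
        return None

if __name__ == "__main__":
    results = {name: connect_rtt(host, port) for name, (host, port) in SITES.items()}
    for name, rtt in results.items():
        if rtt is None:
            print(f"{name}: unreachable")
        else:
            print(f"{name}: {rtt:.1f} ms")
    reachable = {name: rtt for name, rtt in results.items() if rtt is not None}
    if reachable:
        print("Route new sessions to:", min(reachable, key=reachable.get))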

 

As with everything in the world of technology, there are always upgrades to be made, and data centers are not immune to the need for improvement. With a few tweaks and a slightly different perspective, data centers can modernize their operations to best support the needs of today’s users.

 

Evolution of the Data Center

August 2, 2018 at 8:00 AM

 

Just as computers, phones and everything else in our world have made advancements over the years, so have data centers. Data centers play a critical role in networking and have evolved to give businesses better access to their data, with improved usability and easier management. Traditionally, data centers were only as good as the physical space they took up; they were restricted by server space and by having enough room to store hardware. With today’s technological advancements, they are less space-centric and more focused on the cloud, speed and flexibility. Here, we’ll take a look at the evolution of the data center, from inception to the realm of future possibilities.

 

The Early Days

 

The earliest data centers were large rooms filled with computers that were difficult to operate and maintain. These primordial behemoths needed a special environment to keep the equipment running properly – equipment that was connected by a maze of cables and stacked on racks, cable trays and raised floors. Early data centers also used a large amount of power and generated a lot of heat, so they had to be cooled to keep from overheating. In 1964, the first supercomputer, the ancestor of today’s data centers, was built by Control Data Corporation. It was the fastest computer of its time, with a peak performance of 3 MFLOPS; it sold for $8 million and continued operating until 1977.

 

1970s

 

The 1970s brought the invention of the Intel 4004 processor, which allowed computers to be smaller. And in 1973, the first desktop computer, the Xerox Alto, was introduced. Although it was never sold commercially, it was the first step toward eliminating the mainframe. The first LAN was brought to life in 1977 in the form of ARCNET, which allowed computers to connect to one another over coax cables linked to a shared floppy data storage system.

 

1980s

 

The personal computer (PC) was born in 1981, with the introduction of the IBM model 5150. This new, smaller computer was a far cry from the expensive and expansive mainframes that were hard to cool. PCs allowed organizations to deploy desktop computers throughout their companies much more efficiently, leading to a boom in the microcomputer industry. Plus, in 1985, Ethernet LAN technology was standardized, largely taking the place of ARCNET.

  

1990s

  

The 1990s started with microcomputers working as servers and filling old mainframe storage rooms. These servers were accumulated by companies and managed on premises; they were known as data centers. The mid-90s saw the rise of the commercial Internet, and with it came a demand for faster connections, an increased online presence and network connectivity as a business requirement. To meet these demands, new, larger-scale enterprise server rooms were built, housing data centers that contained hundreds or thousands of servers working around the clock. In the late 1990s, virtualization technology originally introduced in the 80s was revisited with a new purpose in the form of the virtual workstation, which was comparable to a virtual PC. Enterprise applications also became available for the first time through an online website.

 

2000s 

 

By the early 2000s, PCs and data centers had grown exponentially. New technology was quickly emerging to allow data to be transmitted more easily and quickly. The first cloud-based services were launched by Amazon Web Services, which included storage, web services and computation. There was also a growing realization of the power required to run all of these data centers, so new innovations were being introduced to help data centers be more energy efficient. In 2007, the modular data center was launched. One of the most popular was from Sun Microsystems, which packed 280 servers into a 20-foot shipping container that could be sent anywhere in the world. This offered a more cost-effective approach to corporate computing, but it also refocused the industry on virtualization and ways to consolidate servers.

 

2010s

 

By the 2010s, the Internet had become ingrained in every part of day-to-day life and business operations. Facebook had become a major player and began investing resources in finding ways to make data centers more cost- and energy-efficient across the industry. Plus, virtual data centers were in use at almost 3/4 of organizations, and over 1/3 of businesses were using the Cloud. The emphasis then shifted to software-as-a-service (SaaS), with subscription and capacity-on-demand models taking the place of infrastructure, software and hardware as the main focus. This model increased the need for bandwidth and led to the creation of huge companies providing access to cloud-based data centers, including Amazon and Google.

 

Today, the Cloud appears to be the path we are headed down, with new technology being introduced and the implementation of the IoT becoming more of a reality every day. We’ve definitely come a long way from the first gigantic mainframe data centers; one can only imagine what the next 60 years of innovation will bring.

 

How Data Centers Can Prepare for a Natural Disaster

May 31, 2018 at 8:00 AM

 

We’ve learned that the Cloud isn’t actually floating in the sky; it’s thousands of data centers full of servers that store and transmit all of the data we use every day. But what happens when a data center is affected by a natural disaster? In this post, we’ll take a look at the defensive strategies used to keep our data safe and our cloud aloft, even in the worst circumstances.

 

From blizzards and hurricanes to floods and fires, we seem to have seen a large number of natural disasters in recent history. Fortunately, data centers are prepared with plans to maintain Internet connections and keep your data accessible even in the worst conditions. By having preparedness plans in place, staff willing to stay at their posts and generators to provide power, key data centers can withstand record-breaking hurricanes and even serve as evacuation shelters for citizens and headquarters for law enforcement.

 

Here are some ways data centers can prepare for natural disasters:

 

Make a Plan

The best line of defense is having a good offense - having a plan in place, testing that plan, having a plan B for when that plan fails and then being ready to improvise. When it comes to Mother Nature, even the most prepared have to roll with the punches as things change.

 

Build a Fortress

The ideal structure to house your data center would be impenetrable. That might be too much to ask, but newly constructed buildings can be made to withstand earthquakes, floods, fires or explosions. Shatterproof/explosion-proof glass, reinforced concrete walls and a strategic location outside flood zones can also provide an extra layer of protection.

 

These additional precautions might not be possible in older buildings, but there are still steps you can take to help protect your data center:

 

·       Move hardware to a safer location if possible:

    - Ideally, a data center should be away from windows, in the center of a building and above ground level

    - Higher floors are better, except in an earthquake zone, where lower floors are safer

·       Install pumps to remove water and generators to keep the pumps running

·       If there are windows, remove objects that could become airborne

·       Fire extinguishing systems should be checked regularly

 

Redundancy is Key

Hosting all data in one place is opening the door for disaster. A safer option is to host it in multiple locations at redundant centers that can back each other up if disaster strikes one or more facilities. These centers don’t have to be on opposite ends of the world, but putting them in different geographic regions is probably the safest bet. They should be far enough apart that one disaster won’t take them all out.

 

Back That Data Up

If there’s no time to back up data to the Cloud, making a physical backup of the data and sending it with someone who’s evacuating is a good second option.
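
As a simple illustration of the “physical backup” idea, the hedged Python sketch below copies one data directory to several backup targets (for example, removable drives). The paths are placeholders, and a real backup process would also verify and, where appropriate, encrypt the copies.

import shutil
from pathlib import Path

# Hypothetical source directory and backup targets; adjust to your environment.
SOURCE = Path("/data/critical")
DESTINATIONS = [Path("/mnt/usb_backup"), Path("/mnt/offsite_sync")]

def back_up(source, destinations):
    """Copy the source directory into each destination (dirs_exist_ok needs Python 3.8+)."""
    for dest in destinations:
        target = dest / source.name
        shutil.copytree(source, target, dirs_exist_ok=True)
        print(f"Copied {source} -> {target}")

if __name__ == "__main__":
    back_up(SOURCE, DESTINATIONS)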

 

Natural disasters are unavoidable, and the most important asset to keep safe is always the people working inside the data center, but with a plan in place to keep Mother Nature at bay, you might be able to salvage the data center too.

 

5 Things You Need to Know About the Cloud

August 31, 2017 at 8:00 AM

 

If you’re like most people, you probably have pictures or other types of files stored in the cloud, but do you really understand what the cloud is? For many people, the cloud remains a mystical place that they still can’t quite comprehend. Here are 5 things you need to know about the cloud:

 

1.  When something is stored in the cloud, it is actually in a physical place. The cloud is like a giant IT data center: a massive infrastructure of thousands of servers connected by cables, switches, connectors and patch panels. All of these parts work together to store data, provide virtual desktops, enable global data access and more.

 

2.  Cloud computing relies on many geographically dispersed servers that provide millions of people with reliable and limitless access to their libraries of images, video, audio and data files through the Internet. This frees up local RAM and hard drive space, but it also means that the interconnect components that make up the cloud need to be fast and dependable to keep up with user demand.

 

3.  The consumer cloud is different from a cloud for business. Consumer cloud computing is for those using cloud Internet services casually at home or in small offices. When it comes to business, there are several cloud models in use:

 

-  Software-as-a-Service (SaaS) – businesses subscribe to an application that is accessed using the Internet

-  Platform-as-a-Service (PaaS) – businesses create their own custom application for everyone in the company to use

-  Infrastructure-as-a-Service (IaaS) – the big names in tech (Amazon, Google, Microsoft, etc.) provide a backbone that can be used or “rented” for use by other companies

 

4.  The cloud is big business and is having a big impact on business. Worldwide public cloud services are anticipated to grow 18% this year to reach $246.8 billion. Cloud computing is also expected to be the most measurable factor impacting businesses in 2017. Cloud platforms allow for more complex business models and coordination of globally integrated networks – more so than many experts predicted. Cloud services are also increasingly being used by small and medium businesses, which is also increasing the revenue forecast.

 

5.  The Internet of Things (IoT) continues to grow, and with the IoT has come increased use of cloud computing technology. Eventually, IoT devices may become extensions of cloud data centers.  The cloud is a powerful force in the technology industry and a global trend that doesn’t seem to be slowing down.

 
