Press & News

GDT honored with two high-profile awards at Cisco Partner Summit

GDT, a leading IT integrator and data center solutions provider, announced today that it received two coveted awards at Cisco’s 2018 Partner Summit, held November 13–15 in Las Vegas. Cisco honored GDT, a Cisco Gold partner, as its South Region Partner of the Year (POY) in both Software Lifecycle Management and Architectural Excellence in Collaboration.

Each year at its Partner Summit, Cisco, one of the IT industry’s most prominent technology organizations, innovators and thought leaders, recognizes partners that have not only developed innovative solutions, but have also deployed them to help customers address and realize their business goals.

“GDT is both humbled and honored to be recognized by Cisco as its Partner of the Year in these two key initiatives,” said GDT President Vinod Muthuswamy. “We take tremendous pride and satisfaction in knowing that a leading technology solutions provider like Cisco values and acknowledges our tireless efforts to help customers achieve their digital transformation goals.”

About GDT

Founded in 1996, GDT is an award-winning, international multi-vendor IT solutions provider and Cisco Gold Partner. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert solutions architects and engineers maintain the highest certification levels in the industry, which helps them translate the latest ideas and technologies into innovative solutions that realize the visions of business leaders and help them achieve their digital transformation goals.

Dallas Technology Integrator GDT Names Eric Power Vice President of Sales, Central United States

Dallas, TX – Dallas-based technology and systems integrator GDT announced today that Eric Power has been named Vice President of Sales, Central United States, effective immediately. In his new role, Power will focus on expanding GDT’s customer base from the southern tip of Texas to the central region’s northernmost markets.

Power joins GDT after almost twenty successful years at Cisco, where he was both the top-performing account manager in US Commercial sales and the eight-year leader of Cisco’s top-performing Mid-Market sales team. In addition, he was asked to conduct front-line leadership training for Cisco’s US Commercial Segment in each of his last five years, and to expand the program globally via training videos.

“We’re very fortunate and excited to welcome Eric to the GDT family,” said GDT President Vinod Muthuswamy. “His proven 100+10 approach to sales leadership (giving 100% effort and spending at least 10% of one’s time helping others) fits perfectly with GDT’s corporate culture and customer-first focus.”

The Dallas Business Journal named Power to its prestigious “Top 40 Under 40 Dallas Executives” list for both his professional success and his considerable efforts outside the office. Power has coached a combined fifty seasons of youth sports and has been a Boy Scouts of America leader for over ten years. In addition, he has served as President and spokesperson for Strikes Against Cancer, a non-profit organization that produces baseball tournaments throughout North Texas, donating money for each strike thrown to help families fighting cancer and to fund cancer research.

Power has been married to his wife Aleisha for over twenty years, and they have two sons, Ethan and Coleton. Power holds a Bachelor of Science (B.S.) degree from the University of North Texas in Denton, Texas.

About GDT

Founded in 1996, GDT is an award-winning, international multi-vendor IT solutions provider and Cisco Gold Partner. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert solutions architects and engineers maintain the highest certification levels in the industry, which helps them translate the latest ideas and technologies into innovative solutions that realize the visions of business leaders and help them achieve their digital transformation goals.

GDT achieves highest sales of Cisco products and services in its 20+ year history

Dallas, TX – GDT, a leading IT integrator and data center solutions provider, announced today that it achieved record sales of Cisco products and services for Cisco’s fiscal year 2018, which ended on July 31. Cisco was GDT’s very first partner; GDT was started in 1996 by founder and owner J.W. Roberts.

“Our long-term partnership with Cisco is one of the key components that has helped build GDT into the company it is today,” said Roberts. “These record revenue numbers are a testament to our strong Cisco relationship, our unwavering belief in their superior products and services, and our ongoing commitment to delivering best-of-breed solutions to GDT customers.”

GDT’s 2018 year-to-date growth has been due in part to tremendous sales increases in several key areas, including service provider, software, collaboration, enterprise networking and security. At a time when the IT industry is experiencing overall growth of less than 5 percent, GDT’s double-digit growth in Cisco products and software speaks volumes about its commitment to helping customers achieve their digital transformation goals.

About GDT

Headquartered in Dallas, TX, with approximately 700 employees, GDT is a global IT integrator and solutions provider approaching $1 billion in annual revenue. GDT aligns itself with industry leaders, providing the design, build, delivery and management of IT solutions and services. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert architects and engineers maintain the highest level of certifications to translate the latest ideas and technologies into innovative solutions that realize the vision of business leaders.


GDT achieves Advanced-Level AWS Partner Network (APN) status

Dallas, TX – GDT, a leading IT integrator and data center solutions provider, announced today that it achieved Advanced-Level status within the elite AWS (Amazon Web Services) Partner Network (APN), and has also been awarded entry into the AWS Public Sector Partner Program. Advancement within the APN is based on revenue generation, commitment to training, and the number and quality of customer engagements.

“Our partnership with AWS has been a very rewarding experience for GDT on a number of levels,” said Vinod Muthuswamy, GDT President. “Our ongoing commitment to leading enterprise and public-sector customers on their digital transformation journeys has been greatly enhanced by our close partnership with AWS. We eagerly anticipate continued success in the future.”

The APN Consulting Partners Program is reserved for professional services firms that help customers design, build, migrate and manage their applications and workloads on AWS. APN Consulting Partners include Network System Integrators, Managed Service Providers (MSPs) and Value-Added Resellers (VARs), and are provided access to a range of resources that ultimately help their customers better deploy, run and manage applications in the AWS Cloud.

About GDT

Headquartered in Dallas, TX, with approximately 700 employees, GDT is a global IT integrator and solutions provider approaching $1 billion in annual revenue. GDT aligns itself with industry leaders, providing the design, build, delivery and management of IT solutions and services. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert architects and engineers maintain the highest level of certifications to translate the latest ideas and technologies into innovative solutions that realize the vision of business leaders.

Dallas Technology Integrator GDT Names Troy Steele Director of Staffing Services

Dallas, TX – Dallas-based technology and systems integrator GDT today announced that Troy Steele has been named Director of Staffing Services, effective immediately. In his new role, Steele will oversee and direct GDT’s staff augmentation practice, which has a 20-year track record of helping customers improve operational efficiencies, reduce costs and drive key initiatives through the placement of IT professionals with the right skillsets.

Steele has spent the past twelve years in the staffing industry and has a proven track record of building highly profitable staffing organizations by understanding clients’ specific needs, corporate philosophies and organizational nuances.

“We’re excited to welcome Troy to GDT,” said Meg Gordon, GDT’s Vice President of Service Operations. “His experience and expertise building successful staffing organizations will greatly enhance our focus on growing GDT’s staff augmentation practice by continuing to provide the perfect candidates to fill customers’ IT staffing needs and requirements.”

Prior to joining GDT, Steele held several executive staffing positions, most recently with Beacon Hill Staffing, where he spent eight years leading technical recruiting teams throughout Texas. Steele holds a Bachelor of Arts in Communications from Southern Illinois University in Edwardsville, Illinois.

About GDT

Founded in 1996, GDT is an award-winning, international multi-vendor IT solutions provider that maintains high-level partner status with several of the world’s leading IT solutions and hardware providers, including HPE, Cisco and Dell EMC. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert architects and engineers maintain the highest level of certifications to translate the latest ideas and technologies into innovative solutions that realize the vision of business leaders.

GDT honored as one of the top technology integrators on CRN’s 2018 Solution Provider 500 list

Dallas, TX – GDT, a leading IT integrator and data center solutions provider, announced today that CRN®, a brand of The Channel Company, has named GDT as one of the top 50 technology integrators in its 2018 Solution Provider 500 List. The Solution Provider 500 is CRN’s annual ranking by revenue of the largest technology integrators, solution providers and IT consultants in North America.

“GDT is very proud to have earned our high ranking on CRN’s 2018 Solution Provider 500 list,” said GDT President Vinod Muthuswamy. “It’s humbling to be listed with so many highly touted and respected companies, and our inclusion is further proof of our steadfast commitment to delivering digital transformation solutions for our customers.”

CRN has published the Solution Provider 500 list since 1995, and it is the predominant channel partner award list in the industry. It highlights the IT channel partner organizations that earned the most revenue in 2018, and it is a valuable resource for vendors looking for top solution providers with which to partner. This year’s list comprises companies with a combined revenue of over $320 billion.

The complete 2018 Solution Provider 500 list is published on CRN.com and is available online at www.crn.com/sp500.

About GDT

Headquartered in Dallas, TX, with approximately 700 employees, GDT is a global IT integrator and solutions provider approaching $1 billion in annual revenue. GDT aligns itself with industry leaders, providing the design, build, delivery and management of IT solutions and services. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert architects and engineers maintain the highest level of certifications to translate the latest ideas and technologies into innovative solutions that realize the vision of business leaders.


Dallas Technology Integrator GDT Names Adnan Khan Director of Hybrid Cloud and DevOps

Dallas, TX – Dallas-based technology and systems integrator GDT today announced that Adnan Khan has been named Director of Hybrid Cloud and DevOps, effective immediately. In his new role, Khan will provide technical leadership for the architecture, design and management of GDT’s software development practice, and will expand on its many cloud-related initiatives.

Khan has extensive, hands-on software development leadership experience utilizing lean practices such as Agile/Scrum. With over 15 years of experience working in high-performance distributed practices, Khan is particularly skilled in the following technologies: wireless WAN (CDMA and GSM), storage area networking (SAN), network-attached storage (NAS), Android-based applications, location-based services, cloud computing, SaaS, blockchain, cryptocurrency and the Internet of Things (IoT) for both the consumer and enterprise markets.

“We’re excited to welcome Adnan to GDT’s team of talented, forward-thinking IT engineers and professionals,” said Brad Davenport, GDT Vice President of Solutions Engineering. “We know his tremendous experience, wide-ranging technological expertise and unique skillsets will prove invaluable to GDT.”

Prior to joining GDT, Khan held several senior-level management positions in the IT industry, and has overseen many on- and offshore teams that consistently delivered complex software solutions, from inception to deployment. Many of those solutions are currently being used by millions of customers of some of the most noteworthy wireless carriers in the world.

Khan holds an MBA from the University of California, Irvine’s Paul Merage School of Business, and a master’s degree in Computer Science from Pakistan’s Karachi University. In addition, Khan holds several IT-related patents.

About GDT

Founded in 1996, GDT is an award-winning, international multi-vendor IT solutions provider that maintains high-level partner status with several of the world’s leading IT solutions and hardware providers, including HPE, Cisco and Dell EMC. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert architects and engineers maintain the highest level of certifications to translate the latest ideas and technologies into innovative solutions that realize the vision of business leaders.

GDT and CloudFabrix to Jointly Offer NextGen IT Transformation Services

Dallas, TX – GDT, a leading IT integrator and data center solutions provider, and CloudFabrix, an AIOps software vendor, have joined forces to accelerate the IT transformation journey for customers with next-generation managed services built on the CloudFabrix cfxDimensions AIOps platform. As a result, GDT will enhance its current managed services offerings, which include cloud, hybrid IT, IoT and customized DevOps solutions. Ideal for VARs and MSPs, the CloudFabrix AIOps platform provides product and services suites for enterprise customers and MSPs, and offers a wide array of foundational capabilities, including any-time, any-source data ingestion, dynamic asset discovery, advanced analytics, machine learning and blockchain.

The CloudFabrix AIOps platform, which addresses cloud, security and architectural needs, also provides implementation services and enterprise support to VARs and MSPs, all of which greatly reduces partners’ time to value (TtV). Now GDT, with its tremendous engineering skillsets and vast experience providing managed services to customers of all sizes across a wide range of industries, will be able to further enhance what it has provided to customers for over 20 years: the delivery of highly innovative IT solutions with a customer-first focus.

“CloudFabrix has already enabled GDT to address many of the architectural and security needs of our customers,” said GDT President Vinod Muthuswamy. “And that, combined with our experience delivering managed services, cloud, hybrid IT, IoT and customized DevOps solutions to customers, will accelerate and improve upon our ability to provide innovative technological solutions that ultimately help customers work on the projects that will help shape their organization’s future.”

Said CloudFabrix Chief Revenue Officer Kishan Bulusu, “We are excited about working closely with GDT, a network integrator that’s made a tremendous name for itself in the managed services, cloud and hybrid IT space. The initiatives we’ve developed at an organic level will not only enhance GDT’s service offerings, but better serve the MSP community at large. Partnering with GDT will also help CloudFabrix enhance our product and platform offerings, and allow us to focus on NextGen technological and architectural capabilities. This will ultimately help CloudFabrix better address and serve the unique needs of our partners’ customers.”

About GDT

Headquartered in Dallas, TX, with approximately 700 employees, GDT is a global IT integrator and solutions provider approaching $1 billion in annual revenue. GDT aligns itself with industry leaders, providing the design, build, delivery and management of IT solutions and services. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert architects and engineers maintain the highest level of certifications to translate the latest ideas and technologies into innovative solutions that realize the vision of business leaders.

About CloudFabrix

CloudFabrix enables responsive, business-aligned IT by making your IT more agile, efficient and analytics-driven. CloudFabrix helps enterprises holistically develop, modernize and govern IT processes, applications and operations to meet business outcomes in a consistent and automated manner. The CloudFabrix AIOps platform simplifies and unifies IT operations and governance of both traditional and modern applications across multi-cloud environments. CloudFabrix accelerates enterprises’ cloud-native journeys by providing many built-in foundational services and turnkey operational capabilities. CloudFabrix is headquartered in Pleasanton, CA.

GDT Wins VMware 2017 Regional Partner Innovation Award

Partners Awarded for Extraordinary Performance and Notable Achievements

GDT today announced that it has received the Americas VMware Partner Innovation Award for the Transform Networking & Security category. GDT was recognized at VMware Partner Leadership Summit 2018, held in Scottsdale, AZ.

“We congratulate GDT on winning a VMware Partner Innovation Award for the Transform Networking & Security category, and look forward to our continued collaboration and innovation,” said Frank Rauch, vice president, Americas Partner Organization, VMware. “VMware and our partners will continue to empower organizations of all sizes with technologies that enable digital transformation.”

GDT President Vinod Muthuswamy said, “GDT is honored to have received the Americas VMware Partner Innovation Award in the Networking & Security category. It’s humbling to know our innovation and focus in network and security transformation is being recognized by leaders like VMware. Our close partnership with VMware is greatly enabling our customers to realize their Hybrid IT and digital transformation vision and goals.”

Recipients of an Americas VMware Partner Innovation Award were acknowledged in 14 categories for their outstanding performance and distinctive achievements during 2017.

Americas Partner of the Year Award categories included:

  • Cloud Provider
  • Emerging Markets Distributor
  • Empower the Digital Workspace
  • Integrate Public Clouds
  • Marketing
  • Modernize Data Centers
  • OEM
  • Professional Services
  • Regional Distributor
  • Regional Emerging Markets Partner
  • Solution Provider
  • Transform Networking & Security
  • Transformational Solution Provider
  • Technology

About VMware Partner Leadership Summit 2018

VMware Partner Leadership Summit 2018 offered VMware partners the opportunity to engage with VMware executives and industry peers to explore business opportunities, customer use cases, solution practices, and partnering best practices. As an invitation-only event, it provided partners with resources to develop and execute comprehensive go-to-market plans. VMware Partner Leadership Summit 2018 concluded with award ceremonies recognizing outstanding achievements in the VMware partner ecosystem.

About GDT

Headquartered in Dallas, TX, with approximately 700 employees, GDT is a global IT integrator and solutions provider approaching $1 billion in annual revenue. GDT aligns itself with industry leaders, providing the design, build, delivery and management of IT solutions and services. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert architects and engineers maintain the highest level of certifications to translate the latest ideas and technologies into innovative solutions that realize the vision of business leaders.

# # #

VMware is a registered trademark of VMware, Inc. in the United States and other jurisdictions.


Dallas Network Integrator GDT’s Spring Fling Bar-B-Que Results in $10,000 Donation to New Horizons of North Texas

Dallas, TX – Dallas-based technology and systems integrator GDT announced at its Annual Spring Fling Bar-B-Que, held May 3rd and 4th, that New Horizons of North Texas will receive this year’s $10,000 winner’s donation.

GDT’s Annual Spring Fling Bar-B-Que was started in 2014 by GDT CEO J.W. Roberts to further the company’s fun atmosphere while benefiting local charities. The event pits ten GDT Account Executives against each other to determine who can smoke the best brisket and ribs. Each cross-departmental team included GDT technology partners such as Cisco, HPE, Dell EMC, Pure Networks, VMware, Veeam, Juniper Networks, Hypercore Networks, Cohesity, QTS, APS, Jive Communications and Global Knowledge.

The Spring Fling Bar-B-Que is centered on a 19-hour, highly competitive cooking event featuring state-of-the-art smokers, secretive pre-event meetings, and closely guarded recipes. It’s a great event full of food and fun, and it provides the perfect environment for camaraderie and relationship building for the over 300 GDT employees in Dallas. And, of course, a winner is crowned, who then unveils the charity selected to receive the $10,000 donation. GDT Account Executive Chris Bedford, who captained the winning team, selected New Horizons of North Texas.

Said Bedford, a 20-year GDT veteran, “Our annual Spring Fling Bar-B-Que is one of the many marquee (and outrageously fun) events our marketing team produces each year, but being able to donate $10,000 to a great organization like New Horizons of North Texas makes it even more special.”

GDT’s Annual Spring Fling Bar-B-Que is one of many examples of the company’s work-hard, play-hard philosophy and its ongoing commitment to giving back to the D/FW community.

About New Horizons of North Texas

New Horizons is a faith-based 501(c)(3) nonprofit dedicated to serving at-risk youth growing up in situations of poverty and academic struggle. The mission of New Horizons of North Texas is to empower at-risk youth to reach their full potential through tutoring, mentoring, and faith-building. New Horizons works with a highly relational, individualized, and long-term approach to support elementary students all the way through their high school graduation, providing over 250 hours of mentorship to each child each year. Visit www.newhorizonsofntx.org to learn more about New Horizons.

About GDT

Founded in 1996, GDT is an award-winning, international multi-vendor IT solutions provider that maintains high-level partner status with several of the world’s leading IT solutions and hardware providers, including HPE, Cisco and Dell EMC. GDT specializes in the consulting, design, deployment and management of advanced technology solutions for businesses, service providers, government agencies and healthcare organizations. GDT’s expert architects and engineers maintain the highest level of certifications to translate the latest ideas and technologies into innovative solutions that realize the vision of business leaders.

The Simplicity of SimpliVity

By Richard Arneson

GDT and HPE SimpliVity combine to create lemonade from a client’s prior, sour deployment

In the world of technology, today’s advancements can quickly become tomorrow’s obsolescence. No organization understands this better than one of the country’s largest energy companies. They needed to refresh their entire IT environment, which they believed they’d already accomplished through their IT organization. What they soon discovered, however, is that a solution, or in this case a refresh, doesn’t truly address the issue of obsolescence if the latest and greatest solution isn’t implemented correctly. GDT has seen this before. And, as they have thousands of times over the past twenty-three years, their engineering team transformed the customer’s sour experience into the sweet taste of technological success.

What went wrong?

Unfortunately, the customer hadn’t initially enlisted the help of one of the IT industry’s most knowledgeable and experienced engineering teams. In addition to the solution they had implemented, they had already begun testing less-than-ideal fixes to remediate the issue. There’s a phrase for that: “throwing good money after bad.”

Through interviews with the customer’s key stakeholders and decision-makers, GDT’s engineering team discovered that the client’s current “solution” didn’t provide them the ability to easily and efficiently replicate data between their production environment and a Disaster Recovery (DR) site. And the solution’s resource-intensive deduplication process, which analyzes incoming traffic and stores only what doesn’t already exist, didn’t accommodate their need for an expedient solution. In short, their critical de-dupe process was slow.
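The deduplication process described above, storing only what doesn’t already exist, can be sketched in a few lines. This is purely an illustrative content-hash sketch, not HPE’s actual implementation: each incoming block is hashed, and a block is written only if its hash hasn’t been seen before.

```python
import hashlib

def dedupe(blocks, store=None):
    """Keep one copy of each unique block, keyed by its content hash."""
    store = {} if store is None else store
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:       # store only what doesn't already exist
            store[digest] = block
    return store

# Four incoming blocks, but only two unique ones are physically stored.
incoming = [b"block-A", b"block-B", b"block-A", b"block-A"]
stored = dedupe(incoming)
print(f"{len(incoming)} incoming blocks, {len(stored)} stored")
```

The hashing step is what makes deduplication resource-intensive: every block must be fingerprinted and looked up before it can be discarded or written, which is why a slow de-dupe path bottlenecks the whole pipeline.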

An aptly named solution

As one of HPE’s premier partners, GDT is in a unique position that few partners have earned and enjoy: the ability and expertise to quickly design, deliver and deploy HPE solutions. GDT engineering knew exactly what the customer needed and, better still, had the hands-on experience to deploy it perfectly: HPE SimpliVity. HPE SimpliVity is a pre-integrated, all-flash, hyperconverged solution that simplifies IT by combining all infrastructure and advanced data services for virtualized workloads, including VM-centric management and mobility, data protection and guaranteed data efficiency.

The Simplicity of SimpliVity

HPE SimpliVity has earned the “simple” in its name for a number of reasons, but for this customer it meant managing the solution through its existing VMware dashboard and management environment. By installing a simple plug-in on VMware’s management platform, vCenter, this new HPE SimpliVity customer can easily and seamlessly enjoy its new solution. And, because the customer is considering a Microsoft Hyper-V migration in the future, it was relieved to know that HPE SimpliVity can accommodate that virtualization platform as well. The customer’s deduplication issue, due in large part to a particularly cumbersome application, was immediately addressed, as HPE SimpliVity can de-dupe and recover a stunning one terabyte (1 TB) of data in a matter of seconds. And with SimpliVity’s impressive, industry-leading 10:1 de-dupe ratio, the customer now enjoys 90% savings in its physical storage capacity.
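The arithmetic linking the ratio to the savings is simple: an N:1 dedupe ratio means only 1/N of the data is physically stored, so the capacity saved is 1 - 1/N. A quick illustrative sketch:

```python
def storage_savings(dedupe_ratio: float) -> float:
    """Fraction of physical capacity saved at an N:1 dedupe ratio."""
    return 1 - 1 / dedupe_ratio

# A 10:1 ratio stores only one tenth of the data, i.e. a 90% saving.
print(f"{storage_savings(10):.0%}")
```

Note how the savings curve flattens quickly: going from 10:1 to 20:1 only moves savings from 90% to 95%, which is why vendors quote ratios and capacity planners think in saved fractions.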

If you have questions about how GDT’s talented engineers and solutions architects can digitally transform your organization, contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

Whatever happened to Google Fiber?

By Richard Arneson

Think back ten years. If you can’t, I’ll do it for you. LeBron James broadcast live which team he would eventually join. South Africa hosted the World Cup and the planet was introduced to the vuvuzela, a noisemaker that all but ruined the TV coverage. And Tiger Woods returned to golf after his soon-to-be ex-wife bounced a 7-iron off his head a few times. And who could forget that Google Fiber launched in Kansas City—the first of many planned cities—to help bridge the digital divide between the haves and have-nots.

What made Google Fiber sound so enticing was that it would extend fiber optics to the curbs of citizens who happened to live in one of the six initial cities the company hand-picked to serve. And don’t forget the speeds Google Fiber promised—up to a gigabit per second, a hundred times faster than the average speeds of the day. What could go wrong? It was going to be digital nirvana.

The National Broadband Plan

In 2009, plans by Internet Service Providers (ISPs) to upgrade and expand their infrastructures stalled out. In the previous few years, ISPs had steadily provided consumers with internet innovations, including DSL and cable-based connectivity. And Verizon had rolled out FiOS, the first iteration of fiber-based connectivity to the home. The providers didn’t feel enough of a competitive threat to pump billions into their infrastructures. Most consumers were happy, at least the ones who had access to broadband. But the government wasn’t.

In 2010, the National Broadband Plan (NBP) was released to help ensure at least 100 million Americans had access to internet speeds of 100 Mbps by 2020. It shoved ISPs into build mode, and the NBP checked that 2020 goal off its list in 2016.

In response to the NBP, Google conducted research and submitted it to NBP stakeholders, showing the economic and competitive impact that lacking a next-gen infrastructure would have on a variety of areas, most notably smart grids, tele-health and distance-based learning. Apparently, Google ate its own dog food. They approached cities about participating in fiber-based, gigabit testbed networks. The response was staggering—Google had hoped a couple dozen cities would take the bait. Instead, they received interest from over a thousand. They had backed their way into a market.

Big Bang Disruption. It’s a thing

Google Fiber benefited from what’s called the Big Bang Disruption model, in which innovative solutions meet a marketplace that’s ready for them. As a result, ISPs heard the banging on the door and began pumping money into building out their infrastructures faster than they had planned—a perfect example of technological disruption. And IoT was looming on the horizon. Yes, faster speeds would be needed. Local governments even took notice and tried to accommodate ISPs by making it easier for new providers to enter markets, and for incumbents to expand without all the red tape.

Google decided which cities to build into based on, essentially, how easy they were to work with. They weren’t looking for tax breaks or financial incentives. They just wanted cities that would be more accommodating when it came time to dig up roads and sidewalks to lay fiber. They wanted streamlined permitting processes and no political conflicts. They knew that more red tape meant more costs. They selected Kansas City, MO; Austin, TX; Salt Lake City, UT; Provo, UT; Atlanta, GA; and Charlotte, NC. They added another five cities before Google announced, just five years later in 2016, that it was suspending further deployment.

A failure? For Google, maybe, but not for the rest of us

You can still get Google Fiber in eleven (11) cities, but its deployment fell well short of its initial prediction—fifty (50) cities. While it has all the earmarks of a failure, what it has provided the public at large can be described as an unmitigated success.

Google Fiber prompted ISPs to build next-gen networks much sooner than they had planned. Google would announce expansion plans to include a city, and incumbent ISPs serving that city would soon broadcast plans to pump dollars into the infrastructure and deliver higher speeds and lower prices to its residents.

And Google Fiber exposed the level of government red tape that prevented ISPs from upgrading existing networks and kept competitors from entering the market. Cities began to examine their own processes and procedures, knowing that without improvements ISPs would simply find other communities to serve. Those reforms resulted in better-served residents, happier ISPs, more desirable accommodations to attract new businesses, and better PR. Win-win.

And don’t forget what all of this means to 5G, which we’re eagerly awaiting. Rental costs for pole attachments and rights of way, which used to be seen by city governments as nice little revenue streams, have been re-examined. They know if they’re seen as getting in the way of the people and their 5G, they’ll be shredded in the press. And at the polls.

Questions about Digital Transformation? Talk to the experts

If you’d like to learn about how GDT’s design engineers and solutions architects turn traditional, legacy infrastructures into innovative, agile machines that make customers more competitive, help them bring applications to market faster, and deliver a superior customer experience, contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

Monetize space? Sure, if you’ve got a launch pad and a few thousand satellites in your garage

FTC cracks down on robocalls, but is apparently poor at collection calls

How Machine Learning is making you smarter on game day

Nvidia drops big chunk of change to round out its data center strategy

Intelligence limited only by your architectural imagination

Data growth— how to store all those 1’s and 0’s

Storage—software style

Thank God I’m a country boy…as long as I get broadband

A road less traveled…than you’d think

The four (4) horsemen of HCI

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

GDT’s CX practice delivers—again—what others can’t

Another company learns what others have for years—the GDT and Cisco partnership doesn’t disappoint

By Richard Arneson

According to a recent survey concerning employee productivity, ninety-seven percent (97%) of respondents believe a lack of collaboration delays projects’ outcomes. It’s an eye-popping statistic, and one a regional cable TV and Internet service provider serving the upper Midwestern United States could relate to. They thought they’d addressed this very issue when they selected a collaboration and communications provider that promised its solution would simplify communications and enhance productivity. It did neither, in large part due to a lack of training that resulted in low adoption rates, with the vast majority of its two thousand (2,000) employees wondering if the solution did just the opposite—complicated communications and lowered productivity.

After enlisting the help of GDT’s Customer Experience (CX) professionals, the customer soon learned that a solution’s success is measured by both effectiveness and employee adoption rates. By turning to GDT, the customer enjoyed both.

From interviews comes an Action Plan

As they always do, GDT’s tenured team of software lifecycle professionals began with key stakeholder interviews, in which stakeholders relayed exactly what was needed to ensure the needs and demands of their teams would be fully addressed.

As a result of that thorough, upfront research, GDT’s CX team knew that Cisco Webex, Cisco’s on-demand collaboration, online meeting, web conferencing and videoconferencing solution, would deliver the perfect solution. They also knew that GDT’s close Cisco partnership, which spans more than two (2) decades, would do exactly what it has done for thousands of customers over the years—perfectly address the customer’s unique technology needs.

The GDT CX team crafted a detailed Action Plan that addressed everything needed for the solution to be considered successful—detailed training and communications plans, measurement of key performance indicators (KPIs), tracking of employees’ usage, barriers (and how to overcome them) regarding adoption, and all business processes that may be affected by GDT’s Webex solution.

“The purpose of today’s training is to defeat yesterday’s understanding.”

The effectiveness of any solution relies heavily on knowledge sharing and training, and GDT’s CX professionals put together a plan that not only addressed both, but also confronted something that can stand in the way of even the best solutions—the fear of change. GDT made multiple training sessions available to accommodate the customer’s workforce, which included an array of work shifts, and also held Lunch and Learns at the customer’s corporate headquarters.

“Success is led by the power of communication.”

When it comes to putting action and training plans to the test, nobody knows better than GDT’s CX team that communication is the linchpin of a solution’s success. Without it, adoption rates are low, metrics aren’t met, the fear of change is validated, and desired business outcomes are quickly extinguished.

GDT’s CX team crafted a communication plan that addressed each of the customer’s concerns. The customer soon learned, as GDT had promised, that migrating to a cloud-based interface brought many advantages. Team spaces in Cisco Webex were created to immediately respond to employee concerns that might affect adoption rates. And training didn’t take a backseat once the solution was deployed—Webex teams were rolled out to multiple departments through virtual desktops.

Cisco Enterprise Agreement (EA)—a simple solution to a complex, decades-old problem

Prior to the 2017 launch of the Cisco Enterprise Agreement, there were three (3) key elements software customers had always wanted but never received: simplicity in license management, flexibility to meet the changing demands of businesses, and the value that comes from financial predictability and an end to retroactive fees. The Cisco Enterprise Agreement delivers on all three (3). It helps organizations buy, consume and manage Cisco technology across Cisco’s entire software portfolio. And it accomplishes something else—freeing up customers’ time, so their personnel can work on more pressing, business-driving initiatives.

“Oh, no, not the CAPTCHA screen”

By Richard Arneson

“If I’ve told you once, I’ve told you a thousand times: I’M NOT A ROBOT!”

Come on, admit it: when you’re trying to access a website and you get the I-am-not-a-robot CAPTCHA screen with the nine (9) stacked images, your heart drops a notch or twenty (20)—especially if you’re on your smartphone, each image is the size of a pencil eraser and you’ve misplaced your reading glasses. Is that a palm tree or a street light? And why did they hide it behind that stupid tree? It’s never a welcome sight, and Google, which offers its CAPTCHA service for free, has made proving you’re not a robot even tougher. Hopefully, this news comes as a relief if you’re getting stumped more frequently and are questioning your problem-solving skills.

Google, what gives? Just let me in the website

Remember the good ol’ days, when proving you weren’t a robot meant deciphering a few slightly swirled letters? But do you also remember how the letters got more and more swirly, until determining the ones listed became a serious challenge?

The puzzle evolved because character recognition programs evolved, as well. They got better, and we’re all to blame. After years of correctly typing in letters, we helped train the recognition programs. By becoming more difficult, the puzzles became more annoying. New and different robot identification was needed, and Google found it in reCAPTCHA (CAPTCHA stands for Completely Automated Public Turing test to tell Computers and Humans Apart), which it acquired from Carnegie Mellon in 2009.

We can thank this big brain for making CAPTCHA more difficult

In 2016, a University of Illinois computer science professor named Jason Polakis published a paper in which he detailed how, by using off-the-shelf image recognition programs, he was able to solve CAPTCHA puzzles with seventy percent (70%) accuracy. Apparently, his paper made its way to Google. Soon after the publishing of the Polakis paper, CAPTCHA images became smaller, fuzzier and obscured by shrubs. Thanks, Jason, now I can’t read last night’s box score on my favorite website. His paper inspired other researchers, who began solving the CAPTCHA audio version with Google’s own audio recognition program.  

According to Polakis, “We’re at a point where making it harder for software ends up making it too hard for many people. We need some alternative, but there’s not a concrete plan yet.”

Failed attempts to supplant CAPTCHA

A lot of brainpower has attempted to replace CAPTCHA, but apparently nobody as “brilliant” as Polakis has tackled the issue. One (1) attempt involved asking users to determine facial expressions, ethnicity or gender. No, that wouldn’t result in controversy.

Another big brain proposed trivia based on nursery rhymes—perfect, unless you want users to resent their parents for not reading to them at bedtime. Another CAPTCHA replacement still required picture identification, but in animated form. So, when the user is asked to identify, say, a camel, it will probably be dressed in a tux and smoking a cigarette.

reCAPTCHA v3—a very judgmental next version

Google’s CAPTCHA team has been working on reCAPTCHA v3, which the company introduced in late 2018. It uses adaptive risk analysis, which essentially scores traffic based on how suspicious it seems. Google first determines what “good traffic” looks like, then uses that data to help detect the bad kind. A website that has deemed a user unsavory, seedy or sketchy can present them with a challenge, such as a password request or two-factor authentication. Sounds pretty standard, right? That is, unless the website determines you’re a pillar of the digital community. Then you’ll be ushered in with the red carpet treatment.

Google hasn’t disclosed exactly what “good traffic” looks like, which makes many wonder how traffic will be judged if a VPN or any anti-tracking extensions are being used.
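For site owners, the mechanics above can be sketched in a few lines: verify the user’s token server-side against Google’s siteverify endpoint, read the returned risk score, and apply a site-chosen policy at each score band. The siteverify URL and the returned JSON fields (success, score, action) are part of Google’s documented reCAPTCHA API; the threshold values, the function names and the three-way policy itself are hypothetical choices for illustration.

```python
import json
import urllib.parse
import urllib.request

def decide(score: float) -> str:
    """Map a reCAPTCHA v3 risk score (0.0 = likely bot, 1.0 = likely human)
    to a site-side action. The cutoffs here are hypothetical policy."""
    if score >= 0.7:
        return "allow"        # pillar of the digital community
    if score >= 0.3:
        return "challenge"    # e.g., password request or two-factor auth
    return "block"

def verify_token(secret: str, token: str) -> dict:
    """POST the client token to Google's siteverify endpoint; the response
    JSON includes "success", "score" and "action"."""
    data = urllib.parse.urlencode({"secret": secret, "response": token}).encode()
    with urllib.request.urlopen(
        "https://www.google.com/recaptcha/api/siteverify", data=data
    ) as resp:
        return json.load(resp)
```

In practice a request handler would call `verify_token`, then branch on `decide(result["score"])`.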

Contact these pros if you’re looking to captcha network security for your organization

To find out how to shore up your organization’s security posture, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of organizations of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, read about it here:

If you’re storing data down under, you’re likely re-thinking that decision, says Microsoft president

What’s left when a supply-chain reliant corporation gets hacked? Paperwork

Introducing your cyber threat starting lineup

Death and Taxes—and you can add this to the mix

If you doubled down on Russia, your bet’s safe

What happens in an ATM, doesn’t always stay in an ATM

Google launches itself into cybersecurity space

Getting Stuffed at Dunkin’ Donuts?

Security Myths Debunked

State of the Union address focuses on technology–briefly

The technology arms race was just amped up

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Interested in a sales career? Give this a look

By Richard Arneson

Technology and IT research leader Gartner predicted in its annual public cloud revenue forecast report that public cloud spend will almost double in the next three (3) years, from its current $182.4bn to $331.2bn by 2022. So, if you’re considering a sales career and are wondering what to sell, you may have found your answer.

Leading the pack

Gartner’s report broke out spend by segment, including IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), SaaS (Software-as-a-Service), BPaaS (Business Process-as-a-Service) and Cloud Management and Security Services.

The far and away leader in projected revenue is SaaS, which Gartner also refers to as Cloud Application Services. Today, it generates almost half of public cloud spend ($80bn), and it’s expected to maintain roughly the same share of overall spend when it reaches, according to Gartner, $143.7bn by 2022.

But the percentage growth leader is IaaS, and by a long shot. It’s expected to grow to roughly two and a half times its current size (nearly 150% growth) in the next three (3) years, from $30.5bn to over $76bn. Then comes PaaS, presently at $15.8bn, which is predicted to reach $31.8bn by 2022.
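As a sanity check, the percentage figures above can be recomputed from the dollar amounts Gartner quotes (the `growth_pct` helper is purely illustrative, not anything from the report):

```python
def growth_pct(current_bn: float, projected_bn: float) -> float:
    """Percentage growth from current revenue to projected revenue."""
    return (projected_bn / current_bn - 1) * 100

# Current vs. projected 2022 revenue, in $bn, per the Gartner figures above
segments = {
    "SaaS": (80.0, 143.7),
    "IaaS": (30.5, 76.0),
    "PaaS": (15.8, 31.8),
}

for name, (now, by_2022) in segments.items():
    print(f"{name}: {growth_pct(now, by_2022):.0f}% growth")
# IaaS works out to roughly 149% growth (about 2.5x its current size),
# well ahead of PaaS (~101%) and SaaS (~80%).
```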

According to Sid Nag, a Gartner research vice president, “We [Gartner] know of no vendor or service provider today whose business model offerings and revenue growth are not influenced by the increasing adoption of cloud-first strategies in organizations. What we see now is only the beginning, though. Through 2022, Gartner projects the market size and growth of the cloud services industry at nearly three times the growth of overall IT services.”

Percentage of overall IT spend

Several recent Gartner surveys have revealed that almost a third of organizations see cloud spend as one (1) of their three (3) top investment priorities. More traditional, non-cloud offerings, such as software and infrastructure, comprise the other seventy-two percent (72%). And Gartner predicts that by the end of 2019 (yes, this year) over thirty percent (30%) of service providers’ investments in new software will shift from license-based software consumption to a SaaS subscription-based model.

According to Michael Warrilow, another Gartner research vice president, “Cloud shift highlights the appeal of greater flexibility and agility, which is perceived as a benefit of on-demand capacity and pay-as-you-go pricing in cloud.”

What he doesn’t mention is where to type in “Public Cloud sales representative” on Monster.com.

Moving to the cloud? First, talk to these folks (you can thank me later)

Migrating to the cloud is a big move; it might be the biggest move of your IT career. If you don’t have the right cloud skill sets, expertise and experience on staff, you may soon be wondering if the cloud is all it’s cracked up to be.

That’s why turning to experienced Cloud experts like those at GDT can help make your cloud dreams a reality. They hold the highest cloud certifications in the industry and are experienced in delivering and optimizing solutions from GDT’s key cloud partners―AWS, Microsoft Azure, Google Cloud and IBM Cloud. They can be reached at CloudTeam@gdt.com. They’d love to hear from you.

If you’d like to learn more about the cloud─migrating to it, things to consider prior to a migration, or a host of other cloud-related topics—you can read about them here:

Survey reveals organizations see the need to utilize more than one (1) public cloud service provider

Government Cloud adoption is growing, but at the rate of other industries?

The 6 (correctly spelled) R’s of a cloud migration

Are you Cloud Ready?

Calculating the costs–soft and hard–of a cloud migration

Migrating to the Cloud? Consider the following

And learn how GDT’s Cloud Team helped these organizations get the most out of their cloud deployments:

A utility company reaps the benefits of the cloud…finally

A company’s cloud goals were trumped by a poor architecture

Government Agency turns to GDT to migrate mission critical apps to the cloud

Monetize space? Sure, if you’ve got a launch pad and a few thousand satellites in your garage

By Richard Arneson

In the event you aren’t aware, there’s a 21st century version of the space race, and Amazon just officially entered it. The $700bn company just filed papers with the U.S. government to launch 3,236 satellites that will provide high speed internet service. They’ll launch them under the name Kuiper Systems, an Amazon subsidiary named after noted astronomer Gerard Kuiper, considered the father of modern planetary science (but you already knew that, right?).

According to Amazon, “Project Kuiper is a new initiative to launch a constellation of Low Earth Orbit satellites that will provide low-latency, high-speed broadband connectivity to unserved and underserved communities around the world.

“This is a long-term project that envisions serving tens of millions of people who lack basic access to broadband internet. We look forward to partnering on this initiative with companies that share this common vision.”

Actually, the number of people who don’t have Internet access is almost 4 billion, or over half the world’s population. And philanthropic proclamations aside, Amazon will certainly enjoy what other ISPs (Internet Service Providers) do—high profits, especially in the short term.

Neighbors in space

Amazon isn’t the first company, and definitely won’t be the last, with plans to monetize space and reap the rewards. In February, a company named OneWeb launched its first satellites after raising $3 billion from some big name investors, including Coca-Cola and Virgin.

And last December, a company named SpaceX launched a couple of satellite prototypes to help support its ambitious plans to shoot 11,000 satellites into orbit. They’re a little behind, though. Their initial goal was to have 400 launched by the end of 2018.

And what would a party be without Facebook making an appearance? They’re currently working on satellite capabilities through a subsidiary named PointView Tech, which is developing a satellite named Athena. Facebook claims it will deliver data ten times (10x) faster than SpaceX. Why SpaceX is in their crosshairs isn’t clear.

But why stop at Internet connectivity?

Apparently, Amazon really likes this space stuff. Another of its companies, Blue Origin, is currently working on launching payload-carrying vehicles into space. Seriously. In fact, it’s already signed contracts with Telesat, another company that wants to provide high-speed internet via orbiting satellites. The Blue Origin vehicles look like a mushroom-shaped rocket from a 1950s outer space film…but with a cooler paint job. To find out what they most resemble, you’ll have to go to www.blueorigin.com. ’Nuff said.

They don’t launch satellites, but they can definitely help digitally transform your organization

If you’d like to learn about how GDT’s design engineers and solutions architects turn traditional, legacy infrastructures into innovative, agile machines that make customers more competitive, help them bring applications to market faster, and deliver a superior customer experience, contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.


“REPEATABLE” IS THE OPERATIVE WORD

GDT Manufacturing and the GDT Project Management team craft a repeatable deployment for hundreds of retail locations

By Richard Arneson

With almost three hundred (300) retail locations in the United States, a global manufacturing corporation serving the business and residential markets needed to greatly expand its wireless network. If that sounds like a straightforward project, it’s anything but when the locations span four (4) time zones and must be completed in four (4) months. And to add to the complexity, it required utilizing contractors—in this case over two hundred (200)—from coast to coast. The project needed a network and systems integrator capable of managing large, complex projects end-to-end. It needed GDT.

Here’s what happens when GDT’s Project Management professionals and GDT Manufacturing join forces

In the IT industry, many projects start out poorly scoped, and this one was no different. Unlike many project managers, though, GDT’s Project Management practice is familiar with the realities of the industry, and the original scope of work is the first thing they carefully analyze. With tens of thousands of successful project implementations and deployments under their belt, they know that understanding exactly what’s needed to ensure a project’s success must come first. It’s about matching the right projects with the right professionals who can deliver the perfect solution accurately and on time (in this particular case, on time actually meant a month ahead of schedule).

Here was one (1) of the major issues—each location had up to four (4) access points, and each access point required customized cabling. Multiplied across almost three hundred (300) locations, that’s well over a thousand (1,000) custom-tailored cable assemblies. Purchasing that amount of Cat6 plenum cable from manufacturers usually wastes time, resources and cable. That is, unless GDT Manufacturing is involved.

GDT Manufacturing—its customized solutions continually set it apart from the others

Located less than a half mile from GDT’s Innovation Campus, the 40,000 sq. ft. GDT Manufacturing production plant is a state-of-the-art facility where over a hundred highly trained professionals manufacture and test customized Layer One (1) assemblies.

This story follows a script that has been replicated hundreds of times—Project Management contacts GDT Manufacturing. From customer-provided blueprints, GDT Manufacturing not only ensured cable assemblies perfectly accommodated each location, but that they included required straps and industry-standard labeling that would enable contractors to more easily install and implement the solution.

When it all comes together

When disparate time zones and unique floor plans are combined with over two hundred (200) contractors, project managers who consistently bring their A game to the table are the difference between success and failure.

With a goal of completing dozens of locations each week, GDT Project Management deployed an online portal from which project status and communications could be monitored and exchanged. Contractors uploaded photographs throughout the installation process, which were closely monitored by a GDT Quality Auditor to ensure all implementations resulted in a cohesive, structured framework. The portal’s easy-to-use, intuitive dashboard needed to easily accommodate hundreds of contractors with varying levels of tenure and expertise—it did. And it also allowed GDT Manufacturing to better facilitate shipping and help simplify invoicing.

Only after the customer’s engineering team had fully tested each access point—and GDT’s Quality Auditor had inspected all work—was the contractor released from the project.

“Repeatable” is key when deploying large-scale solutions

One (1) box per location. One (1) labeling standard. One (1) industry-leading project management practice. One (1) manufacturing organization that continually delivers what others don’t. When they all came together, they made one (1) customer very happy. In fact, they’re currently working with GDT to deliver the same solution to another three hundred (300) retail locations.

Good news, bad news for IoT

By Richard Arneson

How could there possibly be any news—other than the good kind—for IoT? It will inarguably be looked back on as the most impactful technological paradigm, per capita, of all time. More so than the personal computer. The number of IoT devices has surpassed the world’s population. The PC can’t claim a quarter of that.

You can connect cars, wearables, toys, TVs, meters of all types, security systems, monitoring systems for traffic, both auto and foot, and weather. There are smart cities, smart thermostats, smart lighting, smart appliances, etc. And for anything not connected, there are no doubt people tirelessly working on figuring out how to make it smart and controllable via a tablet or phone.

But, sadly, with the good always comes some bad. And in the case of IoT, the bad involves security. According to a recent study, the number of cyber attacks directed at IoT devices doubled in 2018.

IoT attacks were rare prior to 2014, and the most noteworthy attack came two (2) years later, in 2016, when Mirai was introduced. Mirai remains the most prevalent family of IoT malware—the granddaddy of IoT attacks—and it made its name on attacks of the DDoS variety. Many subsequent threats have been offshoots of it.

Patience, the good news is coming

Once Mirai made the scene, a steady stream of IoT-targeted attacks soon launched.

Hajime was launched a couple months after Mirai and took advantage of users who neglected to change default passwords on the routers supplied by ISPs.

IoT Reaper was introduced in late 2017. It didn’t rely on password negligence, but instead targeted HTTP control interface vulnerabilities in publicly facing IP cameras and closed-circuit television cameras (CCTVs). It was a biggie, infecting millions of devices.

The adorably named Hide N Seek virus rode the coattails of IoT Reaper. It also found cameras, but accessed the servers by randomly generating IP addresses. It cryptojacked infected servers, installing crypto miners to steal compute resources and generate virtual currency.

ADB.Miner was the first variant of Mirai, and the first to target Android devices. It used devices’ debugger interface (ADB, the Android Debug Bridge) to install a crypto miner for a virtual currency named Monero.

Fbot, which was also inspired by Mirai, included blockchain-based DNS that was difficult to track. Fbot first targeted ADB.Miner, uninstalled it, then used the infected Android device to crypto mine.

Torii, another Mirai variant, relies on brute-force attacks and routed its traffic through exit nodes of Tor (The Onion Router), free, open-source software for anonymous communications.

Last year, VPNFilter, the first government-led IoT cyber-attack, was detected. It was backed by the Russians and targeted routers used in Ukraine, destroying their firmware and sniffing out weak credentials. Almost every router manufacturer on the market had some vulnerability that could be exploited in the attack.

Finally, the good stuff

The study found that almost ninety percent (90%) of all threats examined, including the aforementioned, could have been combated by either deploying a strong password or updating device software. Doing both will take care of ninety percent (90%) of your IoT security concerns. Easy, simple and secure—a no-brainer.
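Of the two fixes the study points to, the password half can at least be screened automatically. Below is a minimal, illustrative sketch; the 12-character minimum, the character-class rule and the short default-credential list are hypothetical policy choices, not an industry standard.

```python
import re

# Common factory-default credentials (a tiny illustrative sample)
DEFAULTS = {"admin", "password", "12345", "root"}

def is_strong(password: str) -> bool:
    """Length plus character-class variety: at least 12 characters and
    three of four classes (lower, upper, digit, symbol)."""
    if len(password) < 12:
        return False
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return sum(bool(re.search(c, password)) for c in classes) >= 3

def acceptable(password: str) -> bool:
    """Reject factory defaults outright, then require strength."""
    return password.lower() not in DEFAULTS and is_strong(password)
```

Pair a check like this with routine firmware updates and, per the study, the vast majority of the threat families above lose their foothold.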

Also, device manufacturers are getting better at releasing products with security in mind. Up until recently, they had placed little emphasis on it. That’s good news moving forward, but not so much if you’re using a device that’s getting a little long in the tooth.

Even better news—you can call on these IoT and Smart City experts to soothe your fears and keep your IoT roadmap safe, sound and secure

For more information about IoT and Smart City solutions, and how to deploy them with security top-of-mind, talk to the experts at GDT. Their tenured, talented solutions architects, engineers and security analysts understand how to design and deploy IoT and Smart City solutions for organizations of all sizes to help them realize more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at IoT@gdt.com. They’d love to hear from you.

You can read more about Smart Cities and IoT Solutions here:

What’s in store for IoT in 2019?

Smart Sneaks

These are no dim bulbs

Why Smart Cities? It’s in the numbers

Five (5) things to consider prior to your company’s IoT journey

Without Application Performance Monitoring, your IoT goals may be MIA

How does IoT fit with SD-WAN?

GDT is leading the Smart Cities Revolution

FTC cracks down on robocalls, but is apparently poor at collection calls

By Richard Arneson

You wouldn’t want to receive one even if you’d spent the last ten (10) years in solitary confinement at Devil’s Island. It’s the robocall, and in all likelihood, you’re getting anywhere from 20 to several zillion a day. In the United States alone, over 4.1 billion robocalls were made in February. Over half the world’s population could have been called in February, the shortest month of the year, no less.

No, they’re not annoying at all. That is, unless you have something—anything, like a 3-hour wait in line at the DMV—that’s a better use of your time than being told you’ve qualified for a trip to the Bahamas, the lowest mortgage rate ever or are subject to an IRS audit in the event you don’t return the call immediately. But the good news is that the Federal Trade Commission (FTC) finds them annoying, too. So much so that they’ve shut down four (4) of the more active robocall companies and fined them anywhere from $500,000 to $3.64 million.

Here are the offenders, listed in order of sleaziness

These folks are real charmers. Calls from Veterans of America claimed it was an altruistic organization that accepted donations in the form of cars, boats, motorcycles, anything of value, that would be provided to veterans. Instead, they sold the items for profit. It was operated by Travis Deloy Peterson, who faces a $500,000 fine. Yes, Travis got off easy.

If you’ve gotten a robocall from, allegedly, Google, you were probably at the wrong end of a dialer owned by Pointbreak Media. Their calls claimed to be coming from Google in an attempt to get smaller businesses to purchase a service that would improve how they rank in Google searches. They got the biggest fine—$3.64 million.

Higher Goals Marketing hit up anybody who picked up their robocall for services to lower credit card interest rates. The owner’s previous company, Life Management Services, was shut down three (3) years ago. They now owe $3.15 million in fines.

NetDotSolutions is the beast of the bunch and made robocalls on security systems, credit cards, debt relief, warranties, loans, you name it. They ignored the Do Not Call registry, left unlawful messages and used spoof caller IDs. They got hit with a $1.3 million fine.

But can the FTC collect these fines?

The fine amounts sound impressive, but whether the FTC collects them is another matter. According to The Wall Street Journal, which obtained the information courtesy of the Freedom of Information Act, of the $208.4 million in fines the FTC has levied against robocallers since 2015, they’ve collected $6,790, just enough to buy a high-end bicycle.
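For perspective, the collection rate implied by those Wall Street Journal figures can be worked out directly from the numbers above (a back-of-the-envelope check, nothing more):

```python
# Back-of-the-envelope check on the WSJ figures cited above.
fines_levied = 208_400_000   # total FTC robocall fines since 2015, in dollars
collected = 6_790            # amount actually collected
rate = collected / fines_levied
print(f"Collection rate: {rate:.4%}")  # roughly three-thousandths of one percent
```

In other words, the FTC has collected about 0.003% of what it has levied.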

Technology questions? Turn to the pros

If you’d like to learn about how GDT’s design engineers and solutions architects turn traditional, legacy infrastructures into innovative, agile machines that make customers more competitive, help them bring applications to market faster, and deliver a superior customer experience, contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

How Machine Learning is making you smarter on game day

Nvidia drops big chunk of change to round out its data center strategy

Intelligence limited only by your architectural imagination

Data growth— how to store all those 1’s and 0’s

Storage—software style

Thank God I’m a country boy…as long as I get broadband

A road less traveled…than you’d think

The four (4) horsemen of HCI

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

If you’re storing data Down Under, you’re likely re-thinking that decision, says Microsoft president

By Richard Arneson

Yesterday, Microsoft President and Chief Legal Officer Brad Smith said many of its government and enterprise customers want to build data centers outside of Australia. He said that they’re very concerned with Australian legislation passed last December that may leave their data ripe—at least more ripe—for the cybersecurity picking.

Combined, Labor and Coalition parties may have succeeded in leaving the backdoor open

The legislation, titled the Telecommunications (Assistance and Access) Act, basically allows Australian law enforcement to, as it sees fit, hack companies, implant malware and gain backdoor access to their systems, including those of the biggies, such as Facebook, Google and Apple. The goals of the act, which was fervently opposed by the opposition Labor Party, were admirable: find and catch digital evildoers. Who can argue with that? But the Labor Party was afraid of giving the government that level of free rein, so it agreed to a compromise that limited those powers to investigations of terrorism, child sexual offences, or any offence carrying a prison term of at least three (3) years. If it sounds like those “limits” could be stretched due to ambiguity, it’s because they can. By all accounts, the legislation is vaguely worded.

Many in the Labor Party are already expressing regret that they agreed to the legislation in the first place. Yesterday, Labor spokesman Ed Husic told a technology forum in Sydney that he wished he could turn back time. Labor reportedly agreed to the restructured bill for fear that blocking it would leave the party blamed for a terrorist attack suspected to be planned for around Christmas. The attack never came.

Just last week, Labor claimed the Coalition is already reneging on the agreement by not supporting amendments previously published in a bipartisan security report.

Broad security measures

The list of measures the Australian government can utilize is long and as broad as the Mighty Mississippi. It includes, among dozens of other elements, its ability to remove an organization’s form(s) of electronic protection, facilitate access to its services and equipment, install or update security software, modify technology, and be able to conceal that any of the aforementioned measures have taken place.

Several concerns by several players

Australia’s Communications Alliance, the country’s primary lobbying group for the technology sector, fears the law will take a chunk out of the country’s $3.2bn technology export business. Because of it, the group claims, companies and countries will restrict imports from Australia over concerns that Australian devices will be more vulnerable.

Australia’s Human Rights Commission is concerned the law could result in suspects being tricked into providing access to encrypted messages. For instance, an email to an individual or entity instructing them to update an application could ultimately provide the police or a government agency access to users’ devices.

Senate President Scott Ryan, a Liberal, is afraid it will allow agencies access to devices utilized by members of Australia’s Parliament. So, parliamentarians would lose the opportunity to claim parliamentary privilege concerning material seized under warrant. File this under the heading CYA.

Smith acknowledges that the law certainly wasn’t written to push open backdoors and create vulnerabilities, but he’s hearing many companies and governments claim that they’ll no longer put their data in Australia. “So,” he says, “they [customers] are asking us to build more data centers in other countries.”

Forget the politics; secure your network

To find out how to shore up your organization’s security posture, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of organizations of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, read about it here:

What’s left when a supply-chain reliant corporation gets hacked? Paperwork

Introducing your cyber threat starting lineup

Death and Taxes—and you can add this to the mix

If you doubled down on Russia, your bet’s safe

What happens in an ATM, doesn’t always stay in an ATM

Google launches itself into cybersecurity space

Getting Stuffed at Dunkin’ Donuts?

Security Myths Debunked

State of the Union address focuses on technology–briefly

The technology arms race was just amped up

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

“Pure and simple.”

CIO’s quote perfectly describes its GDT and Pure Storage solution  

By Richard Arneson

“Our life is frittered away by details. Simplify, simplify,” noted American author Henry David Thoreau over one hundred sixty (160) years ago. But simple is a word that’s rarely used to describe an IT deployment of any magnitude. And prior to working with GDT, this CIO would likely have agreed. But that all changed when his rapidly growing organization turned to GDT to replace a storage solution it had previously deployed from one (1) of the IT industry’s largest OEMs.

The client, a large oil and gas company with holdings throughout the southwestern United States, needed a simplicity it had never enjoyed from its storage solutions. Its comparatively small IT team was stretched at the seams trying to keep up with the company’s precipitous growth. And in the oil and gas industry, connectivity brings an added challenge: many remote sites are in the most far-flung of locations.

The Challenge

The client had been utilizing a storage solution from one (1) of the industry’s most noted equipment manufacturers. “It worked well for a long time,” said the client’s CIO, “but when scalability was needed, we ran into problems.” The problems came in the form of needed upgrades, which were anything but simple. In fact, its first—and last—capacity upgrade took over forty-eight (48) hours to complete and left its network down for almost half that time. In addition, firmware upgrades required the OEM to take control of the customer’s computer via a WebEx session—neither simple nor efficient.

The Solution

Pure Storage, a ten-year-old, California-based company whose sole focus is developing premier storage solutions for the IT industry, has a rich, long-standing relationship with GDT, and it had a solution that would perfectly address the client’s needs. GDT and Pure Storage knew the FlashArray line could deliver what the customer needed but wasn’t enjoying: simplicity and scalability. The client was working with the right company; Pure Storage created the AFA (All-Flash Array) storage category.

Solutions architects and engineers from GDT and Pure Storage collaborated to design a storage solution that incorporated the Pure Storage FlashArray//X70 at headquarters, which provides up to 1.3 petabytes (PB) of storage. It is the first all-flash, 100% NVMe (Non-Volatile Memory Express) storage solution that is designed for all apps, whether mainstream enterprise or next-gen web-scale. 

After analyzing the unique demands at each of the customer’s remote locations, two (2) Pure Storage solutions were utilized—the FlashArray//X10 and //X20, which provide up to 55 TB and 275 TB of storage, respectively. With latency as low as 250 microseconds, the FlashArray//X line brings new levels of performance to mission-critical business applications.

And the solution allowed all customer data to be natively replicated back to headquarters.

How’s this for simplicity and scalability?

With GDT and Pure Storage, the customer’s upgrades have been reduced from forty-eight (48) hours to ten (10) minutes. Incredible, but not news to the hundreds of customers who have enjoyed the fruits of the GDT and Pure Storage partnership for years.

What’s left when a supply chain-reliant corporation gets hacked? Paperwork

By Richard Arneson

At least they have skating and skiing to fall back on

One of the world’s largest producers of aluminum is currently operating the old-fashioned way—manually. Late Monday night, Hydro, an Oslo, Norway-based aluminum manufacturer, fell victim to a LockerGoga ransomware attack. It’s the same strain that attempted to extort money from a French engineering firm in January. LockerGoga distributes the ransomware by turning victims’ own Active Directory against them.

Here’s where it all got started

The ransomware was launched at one (1) of Hydro’s U.S. plants, then rapidly spread throughout its global operations. Hydro is Norway’s second largest company, with over 35,000 employees in fifty (50) different countries, and its staffers had to turn back the clock and rely on manual processes to manage orders and shipments and monitor smelters scattered throughout the world. Yes, they put pen to paper.

Several automated product lines and extrusion plants were shut down, which resulted in customers from several industries waiting at the curb for their aluminum shipment to arrive. Several automakers were unable to manufacture aluminum automotive components.

Kripos, Norway’s National Criminal Investigation Service, learned about the attack Tuesday morning through the country’s Joint Cyber Coordination Center. Kripos has been liaising with Europol, the EU’s law enforcement intelligence agency, even though Norway isn’t part of the EU; Norway’s 2001 agreement with the EU enables it to work with Europol.

How LockerGoga works

Thanks to LockerGoga, Hydro’s own Active Directory basically attacked itself. It was a three-pronged approach: obtain domain credentials; identify targets by querying Active Directory, which maintains information about users, apps, servers, endpoints and more; then move through the network, self-propagating until the entire company is locked down.

Hydro’s website was down for over twenty-four (24) hours, and it’s still uncertain how long it will take to restore stable IT operations. Most of its IT systems have been down at some point since Monday night, but Hydro reports that power plants operating on IT systems unaffected by the attack are running normally. Since the attack, Hydro has been using Facebook as its primary means of communication. And the hackers didn’t receive a dime of ransom; Hydro did what many companies do and relied on its backups.

Why pick on Norway?

Cyber threat-wise, Norway has had a rough few months. Unless hackers resent the fact that the country of just over five (5) million runs away with most of the gold medals at each Winter Olympics, it just happens to be its turn in cyber security’s nine (9) circles of hell. And it comes at a bad time for Hydro; over a year ago, its plant in Brazil had to shut down amid claims that a spill was destroying the environment, which sent its shares down almost forty percent (40%). In early trading Tuesday morning, its shares dipped another 3.4%.

In February, cyber security investigators uncovered that hackers working with the Chinese government had breached the network of Norwegian software firm Visma, looking to steal intellectual property and client information. The attack, known as Cloudhopper, targets software providers and technology services firms.

Lesson to be learned?

Educate your employees. While it has yet to confirm it, Hydro suspects that one (1) of its 35,000 employees was successfully phished. Somebody opened a strange, but apparently tempting, email. That’s all it takes.

Don’t play a game of chance with your IT security

To find out how to shore up your organization’s security posture, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From its Security and Network Operations Centers, they manage, monitor and protect the networks of organizations of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, read more about it here:

Introducing your cyber threat starting lineup

Death and Taxes—and you can add this to the mix

If you doubled down on Russia, your bet’s safe

What happens in an ATM, doesn’t always stay in an ATM

Google launches itself into cybersecurity space

Getting Stuffed at Dunkin’ Donuts?

Security Myths Debunked

State of the Union address focuses on technology–briefly

The technology arms race was just amped up

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

How Machine Learning is making you smarter on game day

By Richard Arneson

As sports fans, we’ve gotten very, very spoiled. Getting somebody under 30 years of age to imagine life without ESPN broadcasting sports 24x7x365 is like pondering life without smartphones. For a fun point of reference, take a look at a TV broadcast of a football game from the sixties; you can actually see the mylar sheet on which the score is printed waving slightly. Now, consider what sports fans are enjoying just fifty (50) years later.

We live in the Madden NFL era, where video games not only allow us to play armchair quarterback, but get into the action without the torn ACLs or brain trauma. And those rudimentary stats provided by unsophisticated box scores in the morning’s sports page are now provided instantaneously while you’re watching. And they’re anything but rudimentary.

Yes, it’s a great time to be a sports fan, and much of the phenomenal experience is provided to you from the world of machine learning.

NextGen Stats and AWS

In an attempt to buoy its brand and grow viewership, which has been on a slow, steady decline for a few years, the NFL is searching for new, cutting-edge ways to get fans in seats, both at the stadium and in front of man-cave flat screens.

Prior to last season, the NFL followed Major League Baseball’s lead and selected Amazon Web Services (AWS) to become its official technology partner. Like MLB’s Statcast, AWS cloud and its machine learning technology deliver NextGen Stats, which is a sports statistics program developed by Zebra Technologies.

Real-time location data, including speed and acceleration for each player, is gathered from sensors scattered about stadiums that track tags affixed to players’ shoulder pads. Algorithms collect the data and integrate it fast enough to be used for live broadcasts. It allows for the tracking of each player’s movements down to the inch. But, more importantly, that information is passed on to viewers. In addition, chips are inserted into footballs, so viewers and team management can more easily determine if a punter needs to work on hang time or a QB needs to perfect his mechanics throwing particular routes.

The data is analyzed on AWS and provides stats unique to each player, including, for instance, how good a wide receiver is at getting open or how effective the offensive line is at protecting its quarterback. And it’s not just for the fans; teams are enjoying the insights that AWS cloud-based machine learning can provide to their organizations.
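The core idea behind those tracking stats can be illustrated with a minimal sketch. This is hypothetical code, not the actual NextGen Stats or AWS pipeline: given timestamped (x, y) position samples from a player’s tag, speed and acceleration fall out of simple finite differences. The function name and sample data below are invented for illustration.

```python
def speeds_and_accels(samples):
    """samples: time-ordered list of (t_seconds, x_yards, y_yards) tuples.
    Returns (speeds, accels): finite-difference speeds (yd/s) between
    consecutive samples, and accelerations (yd/s^2) between consecutive speeds."""
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5  # straight-line distance
        speeds.append(dist / (t1 - t0))
    # Acceleration: change in speed over the time between the later sample pairs
    accels = [(v1 - v0) / (samples[i + 2][0] - samples[i + 1][0])
              for i, (v0, v1) in enumerate(zip(speeds, speeds[1:]))]
    return speeds, accels

# A receiver sampled every 0.1 s, speeding up along a straight line:
track = [(0.0, 0.0, 0.0), (0.1, 1.0, 0.0), (0.2, 2.5, 0.0)]
speeds, accels = speeds_and_accels(track)
```

The real system does this at stadium scale, for every tagged player, fast enough to feed a live broadcast.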

NextGen Stats attempts to bring to live football games what the Madden NFL Video game has for years—ways to better visualize the action on the field and gain deeper insights. And, from that, maybe help you win your fantasy football league championship.

It’s been around longer than you’d think

The NFL actually began tracking player movement through RFID tags with Zebra Technologies over five (5) years ago. It was unpublicized, perhaps intentionally, because it wasn’t used to enhance the fan experience. At that time, only teams utilized the technology, and each was given information about its own organization only. They primarily used the information to track and monitor players’ recovery. For example, teams would try to identify whether players were overexerting themselves during pre-game warmups by monitoring the times it took them to run pass routes during the game.

Wondering how AI and machine learning can benefit your organization?

If you’d like to learn about how GDT’s design engineers and solutions architects turn traditional, legacy infrastructures into innovative, agile machines that make customers more competitive, help them bring applications to market faster, and deliver a superior customer experience, contact them at SolutionsArchitects@gdt.com or Engineering@gdt.com. They’ve worked closely with trusted partner AWS to deliver the many benefits the cloud provides to organizations of all sizes and from a variety of industries. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

Nvidia drops big chunk of change to round out its data center strategy

Intelligence limited only by your architectural imagination

Data growth— how to store all those 1’s and 0’s

Storage—software style

Thank God I’m a country boy…as long as I get broadband

A road less traveled…than you’d think

The four (4) horsemen of HCI

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

Introducing your cyberthreat starting lineup

By Richard Arneson

It’s March. The lion is slowly morphing into a lamb, with warmer temps and sunscreen to follow. March Madness is days away. Almost seventy (70) college games will soon test your flat screen’s durability. But you don’t have to wait for any brackets to find out who’s starting in this particular tournament, which lasts far longer than a fortnight. It’s an ongoing battle of the boards that takes place each second of every single day. It’s the ongoing fight to secure networks and keep vital data out of the hands of the following players.

Please direct your attention to center court. It’s time to introduce the Cyberthreat starting lineup.

At point guard, a veteran at unknowingly risking the security posture of virtually every business, organization and government on the planet—users.

As you’ve probably heard countless times, user error is the largest threat actor in the cyberthreat starting lineup. Whether it’s IT departments lacking the needed security skill sets to fend off attackers, too many unnecessary privileges being granted, or somebody absentmindedly clicking on a link in an email, internal errors are killers.

But users don’t always act unknowingly. Consider the disgruntled current or ex-employee. If they don’t adhere to the “never burn your bridges” workplace philosophy, they may just want a pound of data flesh. And they’ve even been known to collaborate with organized crime—even governments—to gain information or a big cash payout.

In late 2018, a scientist at biotechnology firm Genentech sold trade secrets to a rival company, which allowed them to manufacture generic versions of Genentech pharmaceuticals.

At the off-guard, an angry-at-the-world, politically driven menace—the hacktivist.

These ne’er-do-wells are politically motivated so, naturally, making the most noise possible is a core motivator. Whether it’s publicly making a statement about their cause du jour or attacking a business or organization they feel has wronged them or the public at large, hacktivists have a delusional belief that they’re lauded by many. Hacktivists attacked extramarital dating site Ashley Madison and divulged the names of tens of millions of members.

At small forward, the well-funded and cyber sophisticated—government-sponsored cyberthreat.

Government-led cyberthreats can count among their motivations a broad list of reasons: economic, military, political…you name it. A year ago, the U.S. and the U.K. issued a joint statement blaming Russia for a series of cyberattacks. The Department of Justice a few weeks ago “shot down” a North Korean-launched botnet. A Norwegian software company revealed that hackers from China’s Ministry of State Security attempted to steal clients’ trade secrets. It was discovered that Iran had for years launched global DNS hijacking attacks against the Middle East, Europe, and North America. The Mexican government used spyware to target colleagues of a slain journalist investigating drug cartels. Six (6) months ago, governments from at least forty-five (45) countries deployed spyware against targets in the U.S., France, Canada, and the UK.

That is a minuscule number of examples of government-sponsored cyber attacks. The list is exhausting. Cyber warfare is the new battleground.

At power forward, and borrowing from a menacing label that dates back decades—organized crime.

Organized crime, whether you’re talking cyber threats or Capone-era Chicago, ultimately exists for a single purpose—illegal profits. The former types are the ones trying to get your logins and passwords, social security numbers, credit card information and health records. They’re the launchers of ransomware, bots and trojans. They’ve lately turned more and more to credential stuffing. And when a better mousetrap is built to stop them, they build a better, smarter mouse.

At center, a starter, but a less publicized or feared cybercriminal—the script kiddie.

These are the amateurs, usually working alone with a bag of chips and a Mountain Dew at their side, who use existing code they’ve found on the dark web to launch their attacks. They don’t develop their own tools; they’re wannabes and generally don’t do extensive damage, but want to prank websites for grins. However, there have been a few noteworthy attacks, like a DDoS event that crippled Yahoo a few years back.

A cybercrime-fighting team that’s been winning for years

To find out how to shore up your organization’s security posture, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of organizations of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, cyberattacks and how to combat the cyberthreat starting lineup, read more about it here:

Death and Taxes—and you can add this to the mix

If you doubled down on Russia, your bet’s safe

What happens in an ATM, doesn’t always stay in an ATM

Google launches itself into cybersecurity space

Getting Stuffed at Dunkin’ Donuts?

Security Myths Debunked

State of the Union address focuses on technology–briefly

The technology arms race was just amped up

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Nvidia drops big chunk of change to round out its data center strategy

By Richard Arneson

Today, Nvidia, the Santa Clara, CA-based technology company best known for popularizing the GPU (Graphics Processing Unit), announced that it will purchase Mellanox Technologies for a whisker under $7 billion. Nvidia will pay $125 per share for the provider of fast interconnect products, which is a fifteen percent (15%) premium over its closing price on Friday.
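The premium arithmetic implies roughly where Mellanox closed on Friday. A quick check (the implied close is derived from the figures above, not a quoted number):

```python
# Derive Mellanox's implied Friday close from the announced deal terms.
offer = 125.00               # Nvidia's per-share offer for Mellanox
premium = 0.15               # stated ~15% premium over Friday's close
implied_close = offer / (1 + premium)
print(f"Implied Friday close: ${implied_close:.2f}")  # just under $109
```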

Nvidia outbid chipmaker rival Intel and sees the purchase of Mellanox and its InfiniBand interconnect technology as the perfect complement to its computing platform. Mellanox, which generated revenues of approximately $1 billion in 2018, pioneered InfiniBand (IB); its IB and Ethernet products are used in over half the world’s supercomputers and in many hyperscale data centers.

With the Mellanox purchase, Nvidia will be able to optimize data center-sized workloads across the entire networking, compute and storage stacks, resulting in greater utilization, higher performance and lower operating costs for customers. Both companies are counting on their performance-centric cultures to make integration seamless.

Definitely not hostile

Nvidia and Mellanox have been working together for years. In fact, they collaborated on building Sierra and Summit, the two (2) fastest computers in the world, both of which are utilized by the U.S. Department of Energy. In addition, most of the world’s top cloud providers use both Nvidia GPUs and Mellanox interconnects.

According to Nvidia CEO Jensen Huang, “The emergence of AI and data science, as well as billions of simultaneous computer users, is fueling skyrocketing demand on the world’s datacenters. Addressing this demand will require holistic architectures that connect vast numbers of fast computing nodes over intelligent networking fabrics to form a giant data center-scale compute engine.

“We’re excited to unite Nvidia’s accelerated computing platform with Mellanox’s world-renowned accelerated networking platform under one roof to create next-generation datacenter-scale computing solutions. I am particularly thrilled to work closely with the visionary leaders of Mellanox and their amazing people to invent the computers of tomorrow.”

According to Mellanox founder and CEO Eyal Waldman, “We share the same vision for accelerated computing as Nvidia. Combining our two companies comes as a natural extension of our longstanding partnership and is a great fit given our common performance-driven cultures. This combination will foster the creation of powerful technology and fantastic opportunities for our people.”

In the opening minutes of today’s trading, Nvidia’s shares were marked 0.25% higher and traded at $151 each. Mellanox shares rose 8.8% to $119 each, a move that would increase their six-month gain to over 55%.

Looking to transform and modernize your data center?

If you’d like to learn about how GDT’s design engineers and solutions architects turn traditional data centers into innovative, agile machines that make customers more competitive, help them bring applications to market faster, and deliver a superior customer experience, contact them at SolutionsArchitects@gdt.com or Engineering@gdt.com. They help customers enjoy an agile, service-oriented data center that’s highly automated and virtualized, and can easily scale up or down to perfectly accommodate their needs.

You can read more about how to digitally transform your infrastructure here:

Intelligence limited only by your architectural imagination

Data growth— how to store all those 1’s and 0’s

Storage—software style

Thank God I’m a country boy…as long as I get broadband

A road less traveled…than you’d think

The four (4) horsemen of HCI

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

Here’s a way to speed up binge watching

By Richard Arneson

Five million HD movies. That’s how many can now be downloaded in a single second. That translates to 26.2 terabits of data each tick of the clock, which is the record recently set by Infinera, an optical communications equipment manufacturer based in Sunnyvale, California. What may be even more surprising is that it was set while speeding down a 4,104-mile, undersea optical cable stretched from Virginia Beach, VA, to the coastal town of Bilbao, Spain.

The cable has a name—Marea. It’s owned by Microsoft, Facebook and Telxius, a Spanish telecommunications company. Marea is one (1) of an estimated three hundred seventy-eight (378) undersea cables crisscrossing the planet’s largest bodies of water. It’s about the width of a garden hose and houses eight (8) pairs of optical fibers. The entire assembly, all 4,104 miles of it, weighs over 10 million pounds. Try dragging that garden hose to your front yard on watering day.

Some other interesting Marea tidbits

  • It takes thirty-three (33) milliseconds for a photon of light to travel from Virginia Beach to Bilbao. And if you think your Internet connection at home or at the office is humming along, average Internet speeds operate at approximately 0.00025 terabits per second. Pretty humbling.
  • The deepest ocean depth Marea reaches is 17,000 feet, or about 3.2 miles.
  • If HD movies are beneath you, you’ll have to settle on Marea delivering a mere 800,000 Ultra HD flicks.
  • Marea shattered the previous maximum data rate by about thirty percent (30%), or over 6 terabits per second. That earlier mark could only handle a paltry 4 million HD movies. That’s like Usain Bolt knocking three (3) seconds off his 100-meter record time of 9.58 seconds.
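The 33-millisecond figure is easy to sanity-check: light in glass fiber travels at roughly c divided by the fiber's refractive index. A quick back-of-the-envelope calculation, assuming a typical single-mode refractive index of about 1.47 (an assumption, not a published Marea spec):

```python
# Sanity check on Marea's ~33 ms one-way transit time.
# The 4,104-mile length comes from the article; the refractive
# index of ~1.47 is an assumed typical value for single-mode fiber.
C_VACUUM_M_S = 299_792_458           # speed of light in a vacuum (m/s)
FIBER_INDEX = 1.47                   # assumed refractive index of the glass
CABLE_MILES = 4_104

cable_m = CABLE_MILES * 1_609.344    # miles -> meters
v_fiber = C_VACUUM_M_S / FIBER_INDEX # light slows down inside glass
transit_ms = cable_m / v_fiber * 1_000

print(f"One-way transit: {transit_ms:.1f} ms")  # roughly 32-33 ms
```

The result lands right around the article's thirty-three (33) milliseconds, which is a nice confirmation that the number isn't marketing fluff.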

I’m glad you asked

The elephant in the room is how a maximum data rate can be exceeded, and by thirty percent (30%), no less. Doesn’t sound doable. It is.

Infinera implemented multiple wavelengths on a single optical chip. This allowed them to pack more wavelengths onto a single optical strand. Then, they transmitted each wavelength as a set of subcarriers, which allowed them to pack the fiber with even more wavelengths. The stuffed fiber, which was still only the thickness of a single strand of hair, allowed Infinera to claim the title.
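The arithmetic behind that kind of record is simple multiplication: wavelengths per fiber, times subcarriers per wavelength, times the data rate of each subcarrier. The parameter values below are purely illustrative assumptions, not Infinera's actual trial configuration:

```python
# Toy model of how per-fiber capacity multiplies out.
# Every value here is an illustrative assumption -- NOT the
# configuration Infinera actually used on Marea.
wavelengths_per_fiber = 16         # assumed wavelength count
subcarriers_per_wavelength = 8     # assumed subcarriers per wavelength
gbps_per_subcarrier = 200          # assumed per-subcarrier rate (Gb/s)

total_tbps = (wavelengths_per_fiber
              * subcarriers_per_wavelength
              * gbps_per_subcarrier) / 1_000
print(f"Aggregate capacity: {total_tbps:.1f} Tb/s")  # 25.6 Tb/s
```

Packing more subcarriers onto each wavelength, or more wavelengths onto the strand, multiplies straight through to the total, which is exactly the lever Infinera pulled.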

For questions, turn to these optical networking experts

If you have questions or would like more information about fiber optics or optical networking, contact GDT’s Optical Networking practice professionals at Optical@gdt.com. Composed of experienced optical engineers, solutions architects and project managers who specialize in optical networks, the GDT Optical Networking team supports some of the largest service providers and enterprises in the world. They’d love to hear from you.

For additional reading about the greatness of fiber optics, check these out:

Just another day in the life of fiber optics

Don’t sell fiber optics short—what it can also deliver is pretty amazing

A fiber optic first

When good fiber goes bad

Busting myths about fiber optics

Just when you thought it couldn’t get any better 

And see how GDT’s talented and tenured Optical Networking team provided an optical solution, service and support for one (1) of the world’s largest social media companies:

When a redundancy plan needs…redundancy

This one’s no leak, and may not carry any more water

By Richard Arneson

It’s a rather un-Republican notion, but it’s resurfacing after having first been mentioned, or rather leaked, in a White House email in early 2018—government involvement in the race to 5G.

This plan calls for taking wireless spectrum from the U.S. Department of Defense (DoD) and, through a third party, making it available to wireless providers at a reduced rate. Many believe it’s a political move to curry favor with rural voters, who would get 5G faster with a little nudge from Uncle Sam.

According to Brad Parscale, Trump’s Campaign Manager for the 2020 election, “America must harness the power of capital markets and private sector to fund and build a state-of-the-art wholesale 5G network that is a model for the world. The government has underutilized spectrum it should share for the purpose. Americans deserve access to affordable wireless.”

FCC Commissioner Brendan Carr is no more enamored of this recent proposal than he was when the memo was leaked a year ago. He cites the lack of government involvement as a primary reason the United States won the race to 4G: spectrum was freed up for commercial use and infrastructure rules were modernized. He sees no reason the same approach won’t work for 5G. He fears that government involvement could limit competition and result in government control over the Internet. Carr believes that government-led spectrum wholesaling would prompt wireless providers to limit their investment once spectrum becomes more commoditized. It’s not that he thinks the government should play no role in 5G, but that its role should be partnering with providers and standards bodies, not owning spectrum.

Wireless carriers, including AT&T, Verizon, T-Mobile and Sprint have yet to weigh in on this proposal, but they’ll likely align with their feelings from a year ago. They don’t like it and don’t want it. The leaked 5G plan from a year ago involved the U.S. government building a brand spankin’ new wireless network.

It’s a no-go, says former National Security Council official

Robert Spalding sees the benefits of government-led championing of 5G, not surprising considering he wrote the leaked memo while an official in the National Security Council. But as the current senior fellow at the Hudson Institute focused primarily on U.S.—China relations, he says it won’t happen, period, because doing so would require the military to share its airwaves.

“At the end of the day,” says Spalding, “an agency like the Department of Defense would have to step up and say this is absolutely required for national security. I know that DoD has no interest in using any kind of department resources in making this a priority.”

Mobility questions? These folks have the answers

If you have questions about your organization’s current mobility strategy (or the one you’d like to implement) and how 5G will affect it, contact GDT’s Mobility Solutions experts at Mobility_Team@gdt.com. The team is composed of experienced solutions architects and engineers who have implemented mobility solutions for some of the largest organizations in the world. They’d love to hear from you.

Intelligence limited only by your architectural imagination

By Richard Arneson

Imagine a world where you come home and toss your car keys, jacket, clothes, wallet, et al., into a bin that would smartly send each to where it belongs—closet, key hook, shoved under the bed, etc. The next morning, when you’re getting ready for the new day, everything is exactly where it should be. No more fishing around for the items conspiring to make you embarrassingly late for your 8 AM meeting. Sound too good to be true? Well, it is, at least at home. But in the IT world, not so much.

The IDC estimates that within the next six (6) years—by 2025—the amount of data in the world will reach one hundred sixty-three (163) zettabytes. That’s about two thousand percent (2,000%) more than today’s tally. Organizations that know how to use their proliferation of data will be better able to develop new sources of revenue, serve customers and become more competitive. That’s where intelligent storage steps in.
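If zettabytes are hard to picture, a quick unit conversion helps (decimal prefixes assumed, i.e. 1 ZB = 10^21 bytes):

```python
# Converting IDC's 2025 projection into more familiar units.
# Decimal (SI) prefixes assumed: 1 ZB = 10**21 bytes.
ZETTABYTE = 10**21
projection_bytes = 163 * ZETTABYTE

gigabytes = projection_bytes / 10**9
terabytes = projection_bytes / 10**12
print(f"{gigabytes:.2e} GB, or {terabytes:.2e} TB")
# 1.63e+14 GB -- about 163 trillion gigabytes
```

In other words, roughly 163 trillion gigabytes, which puts the "mountains of data" framing below into perspective.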

It’s more than intelligent; it’s a game changer

As its name suggests, intelligent storage is just that. Through AI and machine learning, intelligent storage learns—intelligently, of course—behaviors and adapts to its environment. The result is easier, better management of the mountains of data your organization produces daily. Intelligent storage helps you extract actionable insights, understands where data can be best positioned and provides recommended actions, or automatically makes them, while allowing your IT organization to focus on other projects or initiatives. It can be deployed as hardware on-prem, as a virtual appliance or as a cloud service.

The less-than-intelligent traditional storage system

Traditional storage systems place no priority on certain types of data, which means when it’s needed the time-consuming, cumbersome search begins. And depending on the data (think: sensitive information), it can be nerve-rattling, as well. Intelligent storage removes these headaches; it understands workloads and what they need, adapts to changes and is easy to support and manage.

When data was stored primarily in datacenters, traditional storage did its job pretty well. But data has moved far outside the datacenter perimeter. Now it exists everywhere—in private and public clouds, and at the network edge in mobile devices, sensors, even vehicles. And that data needs to be moved to where it can be processed and easily accessible to the end users who need it, and need it now. Tackling today’s data demands with traditional storage is like taking a broken abacus into a CPA exam.

In the cloud, in the datacenter, at the edge—that’s Intelligent Storage

Two (2) years ago, The Economist made this now famous proclamation: “Data are to this century what oil was to the last one: a driver of growth and change.” It’s difficult to argue the point, except for one (1) small thing—the oil supply is limited; we’re slowly running out. Data is the exact opposite. Not only will it continue to grow, but it’ll do so exponentially. So, isn’t it important to know how to best store, transfer, access, analyze and secure it?

Make an intelligent decision about Intelligent Storage—give these folks a call

If you’d like to learn more about how intelligent storage can get your data where it needs to be, so it can be easily accessed, organized and analyzed, contact the storage experts at GDT. For years they’ve been helping customers of all sizes, and from a wide array of industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

Data growth— how to store all those 1’s and 0’s

Storage—software style

Thank God I’m a country boy…as long as I get broadband

A road less traveled…than you’d think

The four (4) horsemen of HCI

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

Wish I may or wish I might, run which apps either on- or off-site?

By Richard Arneson

Utilizing a hybrid IT architecture—running some apps on-site, others in the cloud—may be far more common than you’d think. According to a recent survey of almost one thousand (1,000) IT professionals, ninety-six percent (96%) use the cloud.

If you’re sold on the benefits of utilizing a hybrid IT infrastructure, you’ll no doubt hear from the on-prem naysayers who insist that traditional infrastructures can’t meet the ever-changing demands of the modern business world. Oh, and that purchasing everything upfront is cost prohibitive and not sustainable.

The cloud, they’ll insist, is the only thing that truly provides the rapid innovation needed today. They may act as if on-prem solutions have gone the way of the pager. Not so. Many new technologies are available that improve agility and lower costs, such as, to name a few, hyperconvergence, flash storage and containerization.

But here’s one (1) thing everyone will agree on—if you’re moving to a hybrid IT infrastructure, you’ll be faced with determining which applications should move to the cloud and which should remain on-prem. The following workloads are what many organizations and experts have determined aren’t ready to leave the on-prem nest.

CRMs, ERMs, Supply Chain

Revenue generation and the backbone of your organization’s IT infrastructure: both describe CRM, ERM and Supply Chain Management applications. Keeping them on-site provides easier monitoring and management, and helps ensure they’re up and running. If they aren’t, your organization may soon hemorrhage cash and revenue.

Engineering

Engineering and other technical applications, like those that automate business processes, can be rife with custom requirements that demand a particular level of expertise and skill sets. And because they frequently involve intellectual property and may be subject to compliance or regulatory laws, security is critically important.

Unstructured Data Analytics

Any software used to gather and analyze the unstructured data we use every day—email, rich media, reporting, invoicing and the like—is often maintained on-prem. While unstructured data doesn’t fit into traditional row-and-column databases, it comprises over seventy-five percent (75%) of the total data that organizations process daily. Unstructured data analytics tools turn that data into actionable insights, and running them on-site can provide the ability to respond to business needs faster and more securely.

Structured Data Management and Analytics

Structured data management software manages defined data kept in one (1) or more databases. Workloads related to the management and analytics of structured data involve sensitive information, and running them in the public cloud can increase data exposure risk. And with regulatory and compliance laws to address, running these tools on-site can provide more security and peace of mind.
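The placement reasoning running through the four (4) workload categories above can be boiled down to a rule of thumb: revenue-critical, regulated or sensitive workloads lean on-prem, and the rest are cloud candidates. Here's a toy sketch of that heuristic; the flags and labels are illustrative assumptions, not a formal assessment framework:

```python
# Toy workload-placement heuristic distilled from the criteria
# discussed above. The flags below are illustrative assumptions,
# not a formal cloud-readiness methodology.
def suggest_placement(workload: dict) -> str:
    if workload.get("revenue_critical"):
        return "on-prem"   # e.g., CRM/ERM/supply chain backbone
    if workload.get("regulated") or workload.get("sensitive_data"):
        return "on-prem"   # compliance and exposure-risk concerns
    return "cloud candidate"

crm = {"name": "CRM", "revenue_critical": True}
marketing_site = {"name": "marketing site"}
print(suggest_placement(crm))             # on-prem
print(suggest_placement(marketing_site))  # cloud candidate
```

Real assessments weigh far more factors (latency, integration, licensing), which is exactly the kind of analysis described in the next section.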

Moving to a hybrid IT infrastructure? Talk to these folks first

Migrating to a hybrid IT infrastructure is a big move. If you don’t have the right skill sets, expertise and experience on staff, the many benefits you’re counting on could fall well short of expectations.

That’s why turning to experienced experts like those at GDT can help make your hybrid IT dreams a reality. They hold the highest certifications in the industry and are experienced delivering and optimizing solutions from the IT industry’s premier, best-of-breed providers. They can be reached at CloudTeam@gdt.com. They’d love to hear from you.

If you’d like to learn more about the cloud—migrating to it, things to consider prior to a migration, or a host of other cloud-related topics—you can find them here:

Survey reveals organizations see the need to utilize more than one (1) public cloud service provider

Government Cloud adoption is growing, but at the rate of other industries?

The 6 (correctly spelled) R’s of a cloud migration

Are you Cloud Ready?

Calculating the costs–soft and hard–of a cloud migration

Migrating to the Cloud? Consider the following

And learn how GDT’s Cloud Team helped these organizations get the most out of their cloud deployments:

Assessment turns cloud hopes into a reality

A utility company reaps the benefits of the cloud…finally

A company’s cloud goals were trumped by a poor architecture

Government Agency turns to GDT to migrate mission critical apps to the cloud

If you doubled down on Russia, your bet’s safe

By Richard Arneson

In the event you’re keeping score at home, Russia sits atop the medal standings at the Hacker Olympics. And there’s no indication they’ll lose that top spot any time soon. Unfortunately, these Olympics don’t happen every four (4) years. It’s a race that will never end.

In its latest threat report, CrowdStrike, the organization that uncovered Russia’s hacking of the Democratic National Committee prior to the 2016 election, determined that Russia is leading the cybercrime pack against its nearest competitors: North Korea, China and Iran.

It’s a timed event

This Hacker Olympics consists of only one (1) event, and it’s measured not with judges or style points, but in time. In this case, it’s called “breakout time,” a measurement CrowdStrike created that refers to the time between the breach of the initial point of entry (starting line) and access to the network (finish line). Once the network is reached, the data theft can begin (we’ll call that the medal podium).

According to CrowdStrike, the average breakout time in 2018 was 4 hours and 37 minutes, a figure derived from analyzing over 30,000 thwarted breach attempts among its customer base. Russia’s gold medal-winning speed? A frightening 18 minutes and 49 seconds.

Here’s how the others fared:

Silver Medal—North Korea (2 hours and 20 minutes)

Bronze Medal—China (4 hours)

Dishonorable Mentions—Iran (5 hours and 9 minutes); Organized criminal groups (9 hours and 42 minutes)
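The gaps on this medal stand are easy to quantify. A quick comparison, using the breakout times exactly as reported above:

```python
# Comparing CrowdStrike's reported 2018 breakout times.
# All figures are taken from the report as cited in this article.
def to_seconds(h=0, m=0, s=0):
    return h * 3600 + m * 60 + s

russia = to_seconds(m=18, s=49)       # gold medal: 18m 49s
north_korea = to_seconds(h=2, m=20)   # silver medal: 2h 20m
overall_avg = to_seconds(h=4, m=37)   # 2018 average: 4h 37m

print(f"vs. nearest competitor: {north_korea / russia:.1f}x faster")
print(f"vs. the overall average: {overall_avg / russia:.1f}x faster")
```

Russia comes out roughly seven-and-a-half times faster than runner-up North Korea, and nearly fifteen times faster than the overall average.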

Eight times (8x) faster!

While Russia’s stunningly fast time is impressive—or, rather, scary—what’s probably more concerning is China’s precipitous increase in attacks targeting the United States. Russia’s attacks weren’t as selective as China’s and evenly spanned the globe (lucky globe). North Korea’s were highly focused on revenue-generating attacks, and Iran’s were more focused on the Middle East and North African countries, primarily those also in the Gulf Cooperation Council (GCC).

Don’t be a statistic in the Hacker Olympics

To find out how to secure your organization’s network and protect its mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, cyberattacks, and how to stay at least one (1) step ahead of the bad guys, read more about it here:

What happens in an ATM, doesn’t always stay in an ATM

Google launches itself into cybersecurity space

Getting Stuffed at Dunkin’ Donuts?

Security Myths Debunked

State of the Union address focuses on technology–briefly

The technology arms race was just amped up

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Data growth—how to store all those new 1’s and 0’s

By Richard Arneson

Data growth–this chart is no exaggeration

According to a research study from InformationWeek and Interop ITX, data will grow by an average of over twenty-five percent (25%) annually. And over fifty percent (50%) of companies included in the study manage terabytes of data, many up to ninety-nine (99) terabytes. One percent (1%) manage over a petabyte. And for all you data storage salespeople out there, how’s this news for building your storage sales funnel—well over half will increase their storage spend in the coming months.
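Growth at that rate compounds quickly. A short sketch of what twenty-five percent (25%) annual growth does to a hypothetical 100-terabyte estate (the starting size is an assumption for illustration; only the growth rate comes from the study):

```python
# Compounding the study's 25% annual growth rate on a
# hypothetical 100 TB estate. Starting size is an assumed
# illustration, not a figure from the study.
current_tb = 100.0        # assumed starting footprint (TB)
annual_growth = 0.25      # growth rate cited in the study

for year in range(1, 6):
    current_tb *= 1 + annual_growth
    print(f"Year {year}: {current_tb:,.1f} TB")
# After five years the estate has more than tripled.
```

Five years of compounding turns 100 TB into roughly 305 TB, which is why "we'll buy more storage later" keeps turning into "we need more storage now."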

One (1) of the issues that actually prevents companies from increasing that spend is the head-spinning number of options that can be utilized. They’re blinded by the storage light. Call it paralysis by analysis. But introducing you to the types of data storage you can utilize will hopefully (alliteration alert) provide the perfect prescription for your paralysis.

Cloud Storage

Using the cloud for anything is growing. It’s no different with storage. And why wouldn’t it be? According to a recent survey of almost one thousand (1,000) IT professionals, ninety-six percent (96%) use the cloud.

Companies wanting scalability and cost savings are attracted to cloud storage. Yes, I know, everybody wants those, but if data grows precipitously, cloud storage may not be the best option. As needs increase, cost benefits can decrease. If this issue describes your organization, Network Attached Storage (NAS) may be a better option. But cloud storage is a great offsite backup solution, especially if your local storage solution fails you.

Two (2) other oft-mentioned issues with cloud storage concern security and performance, which is largely based on your Internet speed. But working with the right MSP and cloud provider should greatly mitigate these concerns.

Network Attached Storage (NAS)

By operating at the file level, NAS can connect to file-based protocols, such as NFS and CIFS, through a dedicated network appliance that manages storage and access. Devices connected to it aren’t limited to their own storage capacity, but, because the data is accessed via the network, its performance is dependent on the speed of the network and how it’s performing at the time. So, peak network usage may limit the storage performance of NAS.

Storage Area Networks (SANs)

SANs provide a centralized storage strategy that improves security, maintenance and fault tolerance. A SAN is a high-speed network providing access to shared data resources, and its capacity can be expanded easily, quickly and cheaply. It can also mitigate some of the performance issues of NAS by providing multiple data paths and enhanced performance through segregated networks.

What stings some regarding SANs is the upfront cost, but if you can get past that, the ability to share resources will save costs in the long run.

Direct Attached Storage (DAS)

Unlike SANs, DAS, as its name implies, is directly attached to servers, computers or other connected devices. So, storage enjoys high bandwidth and fast access speeds. But while it’s usually the cheapest storage option, it becomes less cost effective as storage requirements ramp up. Also, it’s more difficult to upgrade, as system-wide backups are required. And the data is siloed, making it more difficult to share and/or easily free up storage space. However, it’s a great option if storage requirements remain fairly constant or don’t grow exponentially.
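The tradeoffs across the four (4) options above can be distilled into a rough rule of thumb. The thresholds and answers in this sketch are illustrative assumptions, not a sizing methodology:

```python
# Toy decision helper distilled from the tradeoffs above.
# The thresholds and labels are illustrative assumptions only.
def suggest_storage(growth_rate: float, needs_sharing: bool,
                    budget_sensitive: bool) -> str:
    if growth_rate > 0.5:
        # Cloud cost benefits shrink with rapid growth; NAS/SAN scale better,
        # but either beats repeatedly outgrowing DAS.
        return "NAS or SAN (rapid growth erodes cloud cost benefits)"
    if needs_sharing:
        return "NAS or SAN (shared, network-accessible data)"
    if budget_sensitive:
        return "DAS (lowest upfront cost for steady workloads)"
    return "SAN (performance, fault tolerance, easy expansion)"

print(suggest_storage(growth_rate=0.1, needs_sharing=False,
                      budget_sensitive=True))  # DAS ...
```

A real recommendation would also weigh network load, security requirements and backup strategy, which is where the conversation with a solutions architect starts.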

Questions about how utilizing the right storage solution can greatly enhance your organization?

If you’d like to learn more about how to digitally transform your organization, talk to the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from a wide array of industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

Storage—software style

Thank God I’m a country boy…as long as I get broadband

A road less traveled…than you’d think

The four (4) horsemen of HCI

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

Storage—software style

By Richard Arneson

To combine the word storage with software sounds somewhat counterintuitive. You store stuff in a container, a physical, you-can-touch-it-and-see-it container. Combining storage with software just doesn’t agree with the ear; it’s like saying, “Please pass me that Volvo.”

Up until recently, the IT industry’s notion of storage conjured images of equipment, whether a large, cumbersome chassis or a 0.1-ounce thumb drive that slides into a USB port. Well, quash those images immediately. It’s 2019, and this is the IT industry; anything goes, even storage that’s defined through software.

It’s not the same thing as Storage Virtualization—but it’s close

Storage virtualization combines pools of storage resources into logical containers and allocates capacity where and when it’s needed. It increases capacity and enhances performance. In a word—efficient. But we’ll get to that in a minute.

While very similar to storage virtualization, software-defined storage (SDS) takes management to the next level. SDS is storage virtualization combined with services and functionality independent of the underlying hardware.

With SDS comes:

Greater Agility

SDS allows storage options that can be altered as needed, and quickly. It provides tremendous flexibility for data centers. And even if your business isn’t growing, one (1) thing assuredly will—the amount of data you’ll accumulate.

SDS provides flexible deployment options and does so, in part, by allowing you to use the non-proprietary hardware of your choosing. And you can utilize existing storage infrastructures, as well, whether in the data center or delivered via the cloud. As your needs scale, so can your storage…quickly.

More Control

You can choose management rules and policies related to storage once and implement them across your entire storage infrastructure. This addresses the (made-up word alert) siloing of separate storage systems that don’t coordinate and communicate well. Once particular data is no longer a top priority, it can easily be pushed, or archived, to other areas based on the policies you set. Another name for this? Information Lifecycle Management (ILM).

And SDS tools provide analytics so you can better plan upcoming purchases. In SDS, more control boils down to having the ability to store the right data in the right place, at the right time and at the right cost. And to do all of this automatically—that’s called “the right speed.”

Streamlined Efficiency

Efficiency can be thought of as cost savings, as different storage systems can be utilized as if they’re one (1). And with data automatically flowing between different storage systems based on priorities set, performance can be accelerated while lowering costs through, among other things, moving data to flash storage when performance related to it is critical. Then, once that performance is no longer a priority, it can be easily moved to cloud storage, even tape or disks.
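The policy-driven movement between tiers described above can be sketched in a few lines. The tier names and age thresholds here are illustrative assumptions, not any vendor's defaults:

```python
# Minimal sketch of policy-driven tiering (ILM-style).
# Tier names and age thresholds are illustrative assumptions.
from datetime import datetime, timedelta

def pick_tier(last_access: datetime, now: datetime) -> str:
    age = now - last_access
    if age < timedelta(days=7):
        return "flash"    # hot data stays on fast, expensive media
    if age < timedelta(days=90):
        return "cloud"    # warm data moves to cheaper storage
    return "archive"      # cold data goes to tape or object archive

now = datetime(2019, 3, 1)
print(pick_tier(datetime(2019, 2, 27), now))  # flash
print(pick_tier(datetime(2019, 1, 10), now))  # cloud
print(pick_tier(datetime(2018, 6, 1), now))   # archive
```

In a real SDS platform the policy engine runs continuously and moves data automatically; the value is that you write the rule once and it applies across every storage system underneath.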

Data is growing exponentially. And once 5G is here, that velocity will only accelerate. Higher performance, greater agility, more control, and at lower costs? If it sounds too good to be true, it’s not—it’s software-defined storage.

Questions about how utilizing software-defined storage can work for your organization?

If you’d like to learn more about how to digitally transform your organization, talk to the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from a wide array of industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

Thank God I’m a country boy…as long as I get broadband

A road less traveled…than you’d think

The four (4) horsemen of HCI

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

What happens in an ATM, doesn’t always stay in an ATM

By Richard Arneson

It’s either not talked about often or doesn’t happen that frequently. But it should come as no surprise to learn that there is malware that targets ATMs. It makes perfect sense. ATMs run software, require connectivity and are stuffed with cash. Let me say that again—they’re stuffed with cash.

The latest ATM attack is quite different from your average attack, though. Actually, it’s a lot different. The malware, named WinPot, turns ATMs owned by an unnamed, but apparently well-known, vendor into slot machines. Its creators are selling it on the dark web for upwards of a thousand bucks. They created an interface that crudely mimics a one-armed bandit: dials represent each of the ATM’s four (4) cassettes, the compartments in which the cash is held (a design meant to prevent an ATM from emptying its entire contents at once).

It’s no game of chance

WinPot differs from a traditional slot machine in one (1) very significant way—there’s no chance or luck involved. Once the “spin” button is tapped, the cash starts flowing. And after a cassette has emptied its cash, a “scan” button instructs the ATM to look for other cassettes that are still loaded with money. The slot machine-like interface is apparently for comedic effect only.

WinPot is not the first malware to attack ATMs. In fact, it’s not even the first to combine ill-gotten gains with laughs, or at least a hacker’s version of humor. Two (2) years ago, Cutlet Maker was made available on the dark web for five (5) grand. It was loaded by plugging a flash drive into an ATM USB port. The interface looked more like the menu from a 1950s-era diner. The felon served as virtual cook, accessing ATM cassettes by pushing “Check Heat”, then extracting cash with the cleverly labeled “Start Cooking” button.

Thankfully, illegally pulling cash from ATMs is no slam dunk

Just last year, Qin Qisheng, a software engineer from China, detected an operating system weakness in ATMs used by Huaxia Bank. Apparently, the OS created a small sliver of time at midnight during which ATM withdrawals weren’t recorded. He withdrew approximately $1 million prior to being arrested. His defense? He was storing the cash in his account for safekeeping, and, once the window had been sealed shut, would return the loot. Qin may know software, but he’s no Clarence Darrow. His defense didn’t hold up in court. He was sentenced to over ten (10) years in prison.

Stay steps ahead of cyberattackers by working with these folks

To find out how to secure your organization’s network and protect its mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, read more about it here:

Google launches itself into cybersecurity space

Getting Stuffed at Dunkin’ Donuts?

Security Myths Debunked

State of the Union address focuses on technology–briefly

The technology arms race was just amped up

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Google launches itself into cybersecurity space

By Richard Arneson

As if they don’t have enough going on, Google just launched its first cybersecurity company. In the event you missed the announcement, you’re not alone. Oddly, the launch was disclosed in a January 24th blog post written by Stephen Gillett.

Alphabet, Google’s parent company, is technically responsible for lift off. The rocket’s name is Chronicle, and it’s currently being tested with several undisclosed Fortune 500 companies. Oh, yes, and Gillett is Chronicle’s new CEO.

According to Gillett, Chronicle will provide two (2) services:

  1. VirusTotal, an anti-malware intelligence service that Google purchased in 2012 and has been running since, and
  2. A cybersecurity intelligence and analytics program designed to help customers better manage and make sense of their own IT security-related data.

Like Google, Chronicle is an Alphabet subsidiary, born out of X, Alphabet’s R & D incubator, also known as The Moonshot Factory. Moonshot technologies make, in Alphabet’s words, “the world a radically better place.” It’s another way of saying that the technologies aren’t being developed to line their pockets as much as to benefit humanity. Not that they’ll be giving away Chronicle for free. Sure, its “moonshot” origins sound noble, but Chronicle isn’t a philanthropic venture.

All that data…

Chronicle, at its core, is focused on the mountains of data that accumulate like dust on an exercise machine. It wants to reduce the time it takes to discover attacks and, in an admirable act of vengeance, turn the tables on hackers. To accomplish this, Chronicle will utilize machine learning that, according to Gillett, is more advanced than anyone else’s in the IT security space.

Chronicle promises to address a security-related issue that gets worse by the day—the proliferation of alerts, many of them false positives, that can’t all be managed by the majority of InfoSec teams. Chronicle will analyze alerts to help personnel better determine which are most critical and most likely to represent genuine threats.
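Chronicle hasn’t published how its analytics actually rank alerts, so the following is purely a hypothetical, minimal sketch of the triage idea described above: score each alert so analysts see the likeliest genuine threats first. The fields and weights are illustrative assumptions, not Chronicle’s method.

```python
# Hypothetical alert triage: surface likely-genuine threats, sink noise.
ALERTS = [
    {"source": "ids", "severity": 3, "asset_critical": False, "seen_before": 41},
    {"source": "edr", "severity": 9, "asset_critical": True,  "seen_before": 0},
    {"source": "waf", "severity": 5, "asset_critical": True,  "seen_before": 7},
]

def triage_score(alert):
    """Toy heuristic: raw severity, boosted for critical assets,
    discounted for signatures that fire constantly (likely false positives)."""
    score = alert["severity"]
    if alert["asset_critical"]:
        score *= 2
    return score / (1 + alert["seen_before"])  # repeat offenders sink

ranked = sorted(ALERTS, key=triage_score, reverse=True)
print([a["source"] for a in ranked])  # ['edr', 'waf', 'ids']
```

Even a crude discount for alerts that fire dozens of times unpunished pushes the noisy IDS signature to the bottom, which is the triage effect the article describes.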

According to Gillett’s blog post, “We want…to capture and analyze security signals that have previously been too difficult and expensive to find. We are building our intelligence and analytics platform to solve this problem.”

Cybersecurity concerns? Talk to these folks

To find out how to secure your organization’s network and protect its mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

Thank God I’m a Country Boy…as long as I get broadband

By Richard Arneson

Ah, yes, life on the farm. The pastoral setting. The din of tractors backfiring, chickens clucking and cows mooing. And don’t forget the sound of farmers cursing their poor Internet connectivity.

Worry no more, rural America! The U.S. government has introduced a plan called the American Broadband Initiative (ABI). Its goal is to ramp up broadband deployment to millions of Americans in rural communities. And over twenty (20) government agencies have joined the cause, which was introduced and signed by President Trump last month.

The ABI’s 3-legged stool

The ABI, as outlined in a White House report, will meet its goals by adhering to the following three (3) elements:

  1. Utilize existing federal assets, such as buildings, towers and land to lower the cost of broadband buildouts. In addition, it will encourage the broadband service providers to expand their infrastructures to include rural America.
  2. Make it easier for the telecom companies to obtain the necessary permitting to build out those infrastructures. The ABI will loosen federal rights-of-way restrictions for broadband providers and allow them to leverage the federal assets (as listed above) to speed up the deployment of broadband access. Call this one “Reduce the red tape.”
  3. Maximize the use of federal funds to better target areas at need, provide more consistency and deliver incentives for state and local governments that efficiently utilize these federal funds.

According to the report, which was authored by Commerce Secretary Wilbur Ross and Agriculture Secretary Sonny Perdue, “While the government serves an important role, we strongly believe that nothing creates innovation more effectively than unleashing the free market economy from burdensome government regulations. Toward that end, the reforms outlined in this report are dedicated to removing regulatory barriers and expanding opportunities for successful private-sector capital investments.”

It is set to begin last December—say what?

In December, the Agriculture Department set aside over six hundred million dollars ($600 million) for grants and loans that will advance the ABI’s goals. And last week the Interior Department announced measures to increase broadband on federally-managed land, which includes allowing broadband service providers to deploy wireless and wired infrastructures on existing communications towers. It’s a significant step considering the federal government manages over twenty percent (20%) of the United States’ acreage, the majority of which is located in rural America. And to support this, they’ve released a mapping tool so service providers can locate the federal government’s infrastructure locations.

And if you thought the federal government had already taken on this initiative…

The ABI is a continuation of two (2) earlier efforts to hasten broadband connectivity to rural America—the 2-year-old Broadband Interagency Working Group and the Connect America Fund, which was set up by the Federal Communications Commission (FCC) in 2014. In December, the FCC allotted an additional sixty-seven million dollars ($67 million) annually in support of it.

The ABI is a good first step. Err, ahh, 3rd step. Its success, though, depends on whether the service providers believe that building out their broadband infrastructure to rural America will turn a profit. Here’s hoping it will.

Questions about ensuring your infrastructure is optimally working for you?

If you’d like to learn more about how to digitally transform your organization, talk to the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from a wide array of industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

DCIM—helping the Offensive Linemen of IT get their due

By Richard Arneson

You usually don’t hear about it until you’re touring a data center in person, if even then. While starstruck by the vast collection of colored cables, twinkling equipment and shiny racks, you probably didn’t give it a thought, much less any credit. Let’s change that. So today—Valentine’s Day—let’s give some much-earned love to the data center’s physical facility. You know, the power, cooling, floor space, environmental control, etc.

DCIM, which stands for Data Center Infrastructure Management, is a software suite that serves two (2) masters. It combines the more glamorous IT with physical assets, the oft-ignored, but vitally important “offensive linemen of the data center.”

A quarterback without a solid offensive line will spend the majority of his time on his backside, regardless of his arm strength, good looks or how high he was selected in the draft. His effectiveness depends squarely on his offensive linemen, who rarely get due credit or big endorsement dollars. But combine the two (2) and you’ve got a team. And that’s a word that can be aptly used to describe DCIM. It’s a team approach for addressing what all data centers experience on a regular basis—change.

Data center change is more than adding equipment and hoping there’s space, A/C and outlets to accommodate it. DCIM allows organizations to manage and deftly address ever-changing workflows and track costs associated with equipment moves, adds and changes. And it simplifies operational complexities, so the overall value of your data center can be comprehensively realized and managed.

Interaction, Communication—Hut, Hut, Hike

If a quarterback doesn’t call a play, offensive linemen may drop back to pass block on a running play; not good—probably a loss of yardage. A DCIM software suite blends IT and facilities, taking both into account, so organizations can optimally deploy data center assets. It allows management to quickly capture an overview of the data center’s functionality and the health of its systems. DCIM can significantly impact cost structures. For instance, energy consumption can be calculated upfront instead of waiting for a bill at the end of the month.

Data center facilities are costly, but cost savings can be gained by identifying unused resources or optimizing existing capacity. Not having to buy additional capacity when it’s not needed is good for the bottom line.
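As a hypothetical illustration of the kind of upfront calculation a DCIM suite automates (all figures below are invented, and a real suite would pull loads from live sensors), here is how monthly energy cost and PUE fall out of a simple asset inventory rather than a month-end utility bill:

```python
# Invented rack loads (kW) standing in for a DCIM asset inventory.
RACKS = {
    "web-tier": 4.2,
    "db-tier":  6.8,
    "storage":  3.1,
}
FACILITY_OVERHEAD_KW = 9.5   # cooling, lighting, power conversion losses
RATE_PER_KWH = 0.11          # assumed utility rate, dollars
HOURS_PER_MONTH = 730

it_load = sum(RACKS.values())                 # power doing useful IT work
total_load = it_load + FACILITY_OVERHEAD_KW   # everything the meter sees
pue = total_load / it_load                    # Power Usage Effectiveness
monthly_cost = total_load * HOURS_PER_MONTH * RATE_PER_KWH

print(f"PUE {pue:.2f}, est. monthly energy cost ${monthly_cost:,.2f}")
```

A PUE near 1.0 means nearly every watt goes to IT equipment; tracking it continuously is exactly the sort of facilities-plus-IT visibility DCIM is meant to deliver.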

DCIM satisfies more than IT and Facilities

While it’s obvious that IT organizations will be key players in the DCIM game, it’s only slightly less obvious that facilities will be fully engaged. But don’t forget what it takes for all of this to work—money. Yes, finance departments will have an eye on a DCIM solution. They’ll probably want to compare costs against business value—DCIM can help provide that.

And don’t forget those folks in the corner offices. Executives are more focused on IT than ever before. IT is no longer just about pushing data across the network, backing it up and manning a help desk. More and more executives are looking to IT departments for revenue generation.

Questions about DCIM, Data Center Modernization and Digital Transformation?

If you’re wondering how to modernize your data center and utilize the management tools needed to ensure you enjoy its true value, call on the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from a wide array of industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

Security Myths Debunked

By Richard Arneson

It could be argued that network security is similar to the average man’s relationship with doctor’s appointments: it isn’t seen as important until something goes wrong.

Appointment-fearing men aside, the following are seen as the two (2) most common myths concerning network security, at least according to Ciaran Martin, the CEO of the National Cyber Security Center, which is the cyber arm of Great Britain’s Government Communications Headquarters (GCHQ). Martin went on to issue this admonishment: “There isn’t much of an excuse any longer for not knowing about security as a business risk.” Nobody can argue his point, even though many don’t abide by it.

Myth One (1)—cyberattacks are targeted

While it’s true that cyberattacks are becoming slightly more targeted, the majority—as in just slightly under a hundred percent (<100%)—aren’t prejudiced. They don’t care one (1) whit who they trap in their web of deceit, lies and downright evil. Many companies still feel they’ve been flying under the radar due to the size of their organization or the industry in which they work. They think their anonymity somehow shields them from attacks. According to Martin, they don’t believe they’ll ever appear in the crosshairs of a cyberattack. “Tell that,” Martin said, “to the Western business leaders hit by NotPetya in the summer of 2017.” That malware attack, which was originally launched by Russia to infect Ukrainian networks, quickly spread throughout the world like a California wildfire. The damages to some individual businesses reached around $300M. Attacks are rarely targeted! Myth Busted!

Myth Two (2)—cyber security is just too darn complicated

While this myth may sound like an April Fools’ joke, it’s not (besides, it’s February 13th). It’s astounding how many C-level executives share this sentiment. According to Martin, “When I view businesses in the UK and around the world, I’m often amazed by the sheer complexity and sophistication of the businesses and the risks that they manage.

“A company that can extract stuff from way below the ground, a company that can transport fragile goods to the other end of the planet in a really short period of time, a company that can process billions of financial transactions every hour is more than capable of managing cyber security risk.”

While this isn’t a security panacea, your company’s security posture can be substantially strengthened by ensuring software and systems are up-to-date. That doesn’t sound so complicated.

Here’s another easy security measure to implement: Conduct security awareness training. Create policies concerning network security, provide accompanying training, and heavily stress the importance of strictly adhering to them. For every employee (and there could be hundreds, maybe thousands) who rolls their eyes at what might seem like commonsensical security training, all it takes is that one individual who doesn’t pay attention.

If nothing else, communicate this to employees: Make sure they ask themselves, prior to opening a link or attachment:

  • Do I know the sender? 
  • Do I really need to open this link or file?

If they don’t consider these questions, your organization could be ripe for the picking. Myth Busted!
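The two (2) questions above can even be sketched as a first-pass filter. This is a hedged, minimal illustration only—the allow-listed domains and risky extensions are invented examples, and real email security requires far more (SPF/DKIM verification, attachment sandboxing, ongoing training):

```python
# Illustrative allow-list and risky file types; not a real security product.
KNOWN_SENDER_DOMAINS = {"gdt.com", "cisco.com"}
RISKY_EXTENSIONS = (".exe", ".js", ".vbs", ".scr")

def first_pass_check(sender, attachment=None):
    """Return reasons to pause before clicking; an empty list means proceed."""
    warnings = []
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain not in KNOWN_SENDER_DOMAINS:        # "Do I know the sender?"
        warnings.append("unknown sender domain: " + domain)
    if attachment and attachment.lower().endswith(RISKY_EXTENSIONS):
        warnings.append("risky attachment type: " + attachment)
    return warnings

print(first_pass_check("ceo@gdt.com", "report.pdf"))        # []
print(first_pass_check("winner@lottery.biz", "prize.exe"))  # flags both issues
```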

Let these folks take the complexity out of your security posture

To find out how to secure your organization’s network and protect its mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

Bleichenbacher—the man, the legend, the TLS attack

By Richard Arneson

No, no, this is a good man.

According to a technical paper published by a team of academics, there’s a new cryptographic attack lurking. It allows attackers to intercept data believed to be secure through TLS (Transport Layer Security), which provides authentication encryption between network devices. The researchers who identified it have dubbed it ROBOT, an acronym for Return of Bleichenbacher’s Oracle Threat.

Technically, it’s not really new

As is often the case, attackers, who often lack creativity, ingenuity, or both, borrow from evil that has already been launched successfully. In this particular case, the miscreants filched from the Bleichenbacher attack, which was launched last century (1998) and victimized SSL servers. In it, the attacker sent carefully chosen encrypted text, known as ciphertext, to a server; the server’s responses, which revealed whether each one decrypted properly, guided the choice of the next ciphertext, and so on, and so on…
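Bleichenbacher’s actual attack exploits PKCS#1 padding errors, but the adaptive loop described above can be illustrated with a much simpler cousin: a toy “parity oracle” attack on textbook RSA, where each answer from the oracle steers the next ciphertext. The tiny primes and the oracle itself are contrived for illustration; this is not the real Bleichenbacher/ROBOT math.

```python
# Toy illustration of an adaptive chosen-ciphertext attack on textbook RSA.
p, q, e = 61, 53, 17
n = p * q                          # 3233
d = pow(e, -1, (p - 1) * (q - 1))  # private key (unknown to the attacker)

def parity_oracle(c):
    """All the attacker ever sees: is the decrypted plaintext odd?"""
    return pow(c, d, n) % 2 == 1

def recover(c):
    """Binary-search the plaintext using only oracle answers."""
    lo, hi = 0.0, float(n)
    mult = pow(2, e, n)        # multiplying c by 2^e doubles the plaintext
    for _ in range(n.bit_length()):
        c = (c * mult) % n
        if parity_oracle(c):   # odd result → doubled plaintext wrapped past n
            lo = (lo + hi) / 2
        else:
            hi = (lo + hi) / 2
    return int(hi)

secret = 1234
ciphertext = pow(secret, e, n)  # what the attacker intercepts
print(recover(ciphertext))      # prints 1234, without ever learning d
```

Each oracle answer halves the interval containing the plaintext, which is exactly the “responses guide subsequent ciphertexts” pattern the article describes.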

The attack targets RSA, a key-exchange algorithm utilized by TLS and its predecessor, SSL (Secure Sockets Layer). Along with key-exchange algorithms, TLS and SSL utilize symmetric-key algorithms, which are faster than their asymmetric counterparts but not quite their equal in strength. The key-exchange algorithm helps the two (2) sides agree on the symmetric keys to use during a TLS or SSL session. Think of the key-exchange algorithm as a mediator who establishes that two parties want to converse, and in which language; the symmetric keys are the language both sides then speak, created for agreed-upon encryption and decryption, including detection of any tampering.
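As a rough sketch of that division of labor (toy numbers only, not a real cipher suite): an RSA-style exchange moves a freshly chosen symmetric key into place, and a fast symmetric keystream then carries the bulk data. The hash-derived XOR stream below stands in for a real symmetric cipher like AES.

```python
import hashlib
import secrets

# Toy RSA parameters (illustrative; real TLS uses 2048-bit keys or larger).
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)  # server's private exponent (Python 3.8+ modular inverse)

# 1. Key exchange: the client picks a fresh symmetric key and sends it
#    wrapped under the server's public RSA key.
sym_key = secrets.randbelow(n - 2) + 2   # avoid trivial values 0 and 1
wrapped = pow(sym_key, e, n)             # only the server can unwrap this
assert pow(wrapped, d, n) == sym_key     # server recovers the key with d

# 2. Bulk traffic: both sides now use the fast symmetric key.
def keystream_xor(key, data):
    stream = hashlib.sha256(str(key).encode()).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))

ciphertext = keystream_xor(sym_key, b"session data")
print(keystream_xor(sym_key, ciphertext))  # b'session data'
```

The asymmetric step happens once per session; everything after it rides on the cheaper symmetric key, which is why compromising the RSA step (as ROBOT does) compromises the whole conversation.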

Before you go hating on the name Bleichenbacher (his first name is Daniel), know that he is one (1) of the good guys: the erstwhile Bell Labs researcher who discovered the attack over two (2) decades ago.

These latest plagiarists come from a long line of scoundrels, though. There have been over ten (10) attacks that borrowed from the original Bleichenbacher attack. They were all effective to some degree, hence the imitations.

Here’s why it’s working

The authors of the TLS protocol had their hearts in the right place, but their retrofitted measures to make guessing the RSA decryption key more difficult have fallen short. Essentially, they’ve patched worn tires instead of buying new ones. What was needed was the replacement of the insecure RSA algorithm. Now, instead, many TLS-capable routers, servers, firewalls and VPNs are still vulnerable.

The hits just keep coming

Not to be a downer, but it’s important to note that this latest attack works against Google’s new QUIC encryption protocol. And how’s this for irony? Google is Daniel Bleichenbacher’s current employer.

Security Experts with the answers

To find out how to secure your organization’s network and protect its mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

State of the Union address focuses on technology—briefly

By Richard Arneson

If you missed it, you’re probably not the only one. It was fleeting, but if it slipped past you, technology was a focal point for one (1) brief, shining moment during last night’s State of the Union address. President Trump alluded to technology when he mentioned the need to increase the federal government’s “investments in the cutting-edge industries of the future.” Given the technological arms race between China and, well, the rest of the world, it’s a safe bet that 5G and AI were on his mind.

Executive Orders are waiting in the wings

The Wall Street Journal reported that Trump is preparing a number of executive orders to ramp up 5G and AI (Artificial Intelligence). While we’ll have to wait patiently to learn what specifically will be addressed, the Journal reports that, according to administration officials, the orders will involve more government resources to advance AI and nudge private companies to enter the race to 5G.

According to Michael Kratsios, a White House technology policy aide, Trump’s overarching technology-related goal is to help ensure that American innovation will remain the envy of the world for generations to come.

Without mentioning the world’s most populated country, Trump’s commitment is clearly aimed at better competing against China, which is, according to most industry analysts both here and abroad, the far and away leader in the race to 5G. That’s not to say security-related issues are playing second fiddle, though. It’s a widely held suspicion that companies utilizing telecom equipment from China—most specifically equipment manufactured by Huawei or ZTE—are opening the door for Chinese espionage.

The United States and several Western European countries are mulling over legislation that would ban equipment manufactured by Huawei or ZTE. On Wednesday, Rob Strayer, the deputy assistant secretary for cyber and international communications and information policy, warned countries that purchasing Huawei networking gear would expand China’s surveillance capabilities to all four (4) corners of the world. Strayer warned that by using its massive 5G presence, Huawei would be poised to steal trillions of dollars in intellectual property and more easily deploy malware and attack competitors’ networks.

Can you afford to not talk to Security Experts?

To find out how to secure your organization’s network and protect its mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

A road less traveled…than you’d think

By Richard Arneson

According to a recent study by a New York-based IT consultancy firm that works exclusively with Fortune 1,000 corporations, large companies aren’t transforming into truly data-driven organizations as fast as you’d suspect. The big boys, as it turns out, are a little behind the curve when it comes to utilizing data and analytics to help drive their organizations. Apparently, the road to digital transformation is paved with fewer Fortune 1,000 logos than you’d think.

While determining the degree to which organizations are data-driven is subjective, its meaning isn’t. In its most basic definition, data-driven refers to the management of captured data to help, through analytics, develop business-driving and revenue-generating strategies and initiatives. Let’s face it, hunch-based decisions aren’t as sound as those derived through analytics.

The Study

The survey included sixty-four (64) C-Level technology executives from some of the world’s largest corporations—the biggies, ones we’ve all heard of, and many whose products we use daily. It’s not that they don’t regard becoming data-driven as highly important; it’s that a spate of obstacles, both internal and external, has hamstrung their efforts.

Their impeded journeys aren’t due to a lack of spend, though. Over ninety percent (>90%) of those surveyed reported that their AI and Big Data spends are growing, and over fifty percent (>50%) said their investments in both have exceeded $50 million. And respondents confirmed that they’re building organizations specifically to address them. In fact, almost seventy percent (<70%) currently have a Chief Data Officer. But here’s a big issue: seventy-five percent (75%) fear that moving too aggressively toward Big Data and AI may kink up their operations. They’ve spent the money, hired the driver, but they’re skittish about getting the race car onto the track.

What’s holding them back?

Evolving into a data-driven organization, especially when employing hundreds of thousands of workers in dozens of countries and spanning several continents, is no mean feat. It’s a slow process, even painfully so. Nobody would argue that point. But the slow migrations can be chalked up to more than a simple “It takes a while” retort. Over forty percent (>40%) stated that their organization isn’t properly and cohesively aligned to become data-driven, and almost twenty-five percent (<25%) said that cultural resistance is hampering their speed to digital transformation. Interestingly, though, just over seven percent (>7%) said that technology (yes, technology!) didn’t present any of the primary challenges. Here’s what did—business adoption. Almost eighty percent (<80%) cited it as their greatest challenge.

Another issue that may be holding them back is the immediate need to secure revenue, which can easily push data-driven initiatives to the back burner. They know how digital transformation will enhance and advance their organization, but revenue will always hold the trump card.

Additional, and rather surprising, findings

One (1) of the things that’s surprising is that the percentage of respondents who identified their organization as being data-driven has dropped in each of the past three (3) years, from thirty-seven percent (37%) in 2017 to a skosh over thirty percent (30%) today. Here are a few more shockers:

Over seventy percent (>70%) admitted that they haven't yet created a "data culture", almost seventy percent (<70%) said that they haven't developed a data-driven organization, over fifty percent (>50%) don't consider and treat data as a corporate asset, and over fifty percent (>50%) aren't utilizing data and related analytics to help them become more competitive in the marketplace.

Questions about how to transform your organization into one (1) that’s data-driven?

If you’d like to learn more about how AI, Big Data and Analytics can digitally transform your organization, talk to the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from a wide array of industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

The four (4) horsemen of HCI

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

What’s in store for IoT in 2019

By Richard Arneson



Prognostications that IoT will become more pervasive in our daily lives are like predicting the New England Patriots will win another Super Bowl. Like it or not, both are going to happen. But the question is, how exactly will IoT become more pervasive? The following predictions aren't exactly going out on a limb, but they inch slightly closer to it.

Here are a few of the ways IoT will become more ubiquitous in our day-to-day lives in 2019:

Voice Control

You better get those voice lessons out of the way soon. If enunciating is an issue, you may find IoT more frustrating in the near future. Sure, we’re used to carefully pronouncing words for Siri and Alexa, but get ready for a host of new IoT devices that will be voice-controlled. Take the automotive industry, for instance. It would be difficult to find an automotive manufacturer that isn’t developing virtual assistants to help with much of the functionality that still relies on manual operation.

And Natural Language Processing, which is a subset, of sorts, of Artificial Intelligence (AI), is helping bridge gaps between computers and the human voice. As much as we’d like them to, computers can’t interpret what we’re trying to say. But with advances in Machine Learning (ML) and Deep Learning (DL), computers are becoming more equipped to perform translations, including understanding semantics.

When that fifth G finally gets here…

This is the year when 5G is targeted for wide-scale unveiling. It will offer speeds up to twenty times (20x) faster than the sluggish mobile networks we’re currently using. Guess what IoT really needs to become more prevalent and influential in our daily lives? Speed. Sure, availability is a big component—if you can’t get it, how can you use it, right?—but the speed of 5G is what will get IoT rolling like a tumbleweed in a West Texas windstorm.

AI…IoT’s collaborator, friend and protector

Artificial Intelligence (AI) and IoT go together like, to quote Forrest Gump, "peas and carrots." IoT pulls in enormous amounts of data, and AI, through learning algorithms, will rely on it to detect anomalies and outliers, create opportunities to uncover and enhance efficiencies, and provide alerts that signal impending problems. And through automated threat detection, AI will help IoT become, and remain, more secure.

Smarter edges

Consider the security camera, one (1) of the superstars of IoT. Imagine how much data it collects that is useless. Hours and hours of worthless 1's and 0's getting pushed to the cloud or a server, even though only a fraction of it will ever need to be accessed. Now imagine IoT devices that can handle computations on their own. This will free up storage space for vital, useful data, and networks will be less congested, to boot. Software and hardware will be running on the IoT device and, in the camera example, image recognition algorithms will recognize changes and push only the associated data to the cloud or storage.

Get more IoT and Smart City info from the experts

For more information about IoT and Smart City solutions, talk to the experts at GDT. Their tenured, talented solutions architects, engineers and security analysts understand how to design and deploy IoT and Smart City solutions for organizations of all sizes to help them realize more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at IoT@gdt.com. They’d love to hear from you.

You can read more about Smart Cities and IoT Solutions here:

Smart Sneaks

These are no dim bulbs

Why Smart Cities? It’s in the numbers

Five (5) things to consider prior to your company’s IoT journey

Without Application Performance Monitoring, your IoT goals may be MIA

How does IoT fit with SD-WAN?

GDT is leading the Smart Cities Revolution

But it’s just so exciting!

By Richard Arneson

Ever tossed aside an installation manual and gone without because the excitement of a new gadget ran counter to the need to practice patience? If you answered in the affirmative, you can probably relate to the following.

To say the issue is plaguing the IT industry may be an overstatement, but it’s certainly something that’s becoming more common. Like a kid on Christmas morning, many organizations are so excited about implementing the latest and greatest technology that they’re getting ahead of themselves. They end up giving short shrift to something that, if ignored, can quickly turn that excitement into misery—having a strong, sound security posture.

Not delaying gratification may result in poor security

On Tuesday, IDC released findings from a research project they conducted based on a survey of 1,200 IT and security executives from nine (9) countries. Among other things, it revealed this security-related nugget—while companies are quickly adopting many of the new, exciting technologies of the day, they're dropping the ball when it comes to protecting their organizations against cyber threats. Metaphorically speaking, they're tossing aside the instruction manual so they can get to the good stuff sooner.

It's certainly not a matter of organizations ignoring the need for security, however. They're challenged with consistently applying needed levels of security across all architectures. The result: attempts to retrofit older security solutions to address new, transformative technologies. They're bringing a knife to a gunfight. And if they've spent big bucks on new technology, security to protect it may be getting shorted. In fact, only half of those surveyed said that they're expecting an increase in their security budget. That percentage is down considerably—just one (1) year ago, seventy-nine percent (79%) expected an increase in their security spend.

The research also revealed that sixty percent (60%) of respondents have experienced a breach, and thirty percent (30%) have fallen victim to at least one (1) in the past year. And remember, respondents represented nine (9) different countries. U.S.-based organizations in the study experienced higher rates—sixty-five percent (65%) have been hit with a breach, and thirty-six percent (36%) have had at least one (1) in the past year. What's odd, even disconcerting, is that only eighty-six percent (86%) of respondents acknowledged that they are vulnerable to a security threat. So, fourteen percent (14%), or a hundred sixty-eight (168) respondents, may have put down roots in a fool's paradise.
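The respondent counts quoted above follow from simple arithmetic against the survey's 1,200-person sample. A quick sanity check (percentages taken from the study as reported):

```python
respondents = 1200
vulnerable_aware = 0.86  # share acknowledging they're vulnerable to a threat

complacent_share = 1 - vulnerable_aware
complacent_count = round(respondents * complacent_share)
print(f"{complacent_share:.0%} of {respondents} = {complacent_count} respondents")
# 14% of 1200 = 168 respondents
```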

Complexity listed as one (1) of the primary barriers against better securing organizations

The aforementioned study revealed that forty-four percent (44%) of respondents rated complexity as a key barrier to implementing data security. That’s why turning to security experts is an important first step in helping your organization protect against cyberthreats.

To find out how to secure your organization’s network and protect mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, check out the following articles:

The technology arms race just got amped up

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Just another day in the life of fiber optics

By Richard Arneson

It’s 1958. Eisenhower was finishing up his second term in The White House, NASA was created, Arnold Palmer won the first of his seven (7) major titles, and, as everyone knows, Richard Sturzebecher was tasked with creating a formula that represented light traveling through glass fiber. It was an odd request, especially considering Sturzebecher was a 2nd Lieutenant in the United States Army.

It’s been sixty (60) years since the manager of Copper Cable and Wire, a department within the U.S. Army Signal Corps Labs, decided the signal transmission issues they regularly experienced due to lightning and water were no longer bearable. They turned to Sam DiVita, the Signal Corps’ Manager of Materials Research, to develop a solution to replace the copper wire they used to transmit signals. DiVita, borrowing from years of conjecture by several notables in the industry, such as Alexander Graham Bell, was convinced that light sent down strands of glass just might work. He consulted with his engineers, who lazily opined that the glass fiber would break. Yes, they mailed it in that day.

DiVita promptly turned to Lt. Sturzebecher and yanked him from his duties to focus on the hunch. Sturzebecher was attending Signal School at the time, but had just finished his senior thesis, which detailed how he had melted several triaxial glass systems by using Silicon Oxide (SiO2). Apparently, it was a very impressive feat. Word about his thesis began to spread, and soon Sturzebecher was asked to construct a formula that supported DiVita’s hopes that glass fiber could indeed transmit light signals.

Apparently, Sturzebecher had been holding out. He already knew—or at least suspected he knew—the answer. He used a microscope to measure the refraction of SiO2 glass. The lack of glass powders present allowed more light to pass through the microscope’s slide and into his peepers. Other than developing a headache from the light’s brightness (that’s actually true), Sturzebecher discovered that the purer the SiO2 glass, the more brilliant the light. It just so happens that Sturzebecher already knew about Corning’s high purity SiO2 powder, which they had earlier developed by oxidizing pure SiCl4 (Silicon tetrachloride) into SiO2. Thanks to Sturzebecher’s brain, and the headache that afflicted it, he was able to develop his formula. As you probably guessed, Corning Glass Works won the contract.

Years later, in the early Sixties, DiVita was forced to make the discovery public when a new federal bid solicitation, issued under a law allowing any research laboratory to bid on federal contracts, officially let the high-purity SiO2 cat out of the bag. Federal funding of The Signal Corps' fiber optic research continued until the mid-1980s.

While not a household name, Sturzebecher—or any pronunciation of it—should be.

For questions, turn to these optical networking experts

If you have questions or would like more information about fiber optics or optical networking, contact GDT’s Optical Networking practice professionals at Optical@gdt.com. Composed of experienced optical engineers, solutions architects and project managers who specialize in optical networks, the GDT Optical Networking team supports some of the largest service providers and enterprises in the world. They’d love to hear from you.

For additional reading about the greatness of fiber optics, check these out:

Don’t sell fiber optics short—what it can also deliver is pretty amazing

A fiber optic first

When good fiber goes bad

Busting myths about fiber optics

Just when you thought it couldn’t get any better

And see how GDT’s talented and tenured Optical Networking team provided an optical solution, service and support for one (1) of the world’s largest social media companies:

When a redundancy plan needs…redundancy

Ask not what you can do for Wi-Fi, but what Wi-Fi can do for you

By Richard Arneson

It’s about time Wi-Fi technology pulls its own weight. It’s never fast enough and its signal strength wavers like a politician. And now it needs range extenders to boost its deficient signal. Isn’t it about time we ask Wi-Fi what else—other than connecting us to the world—it can do for us? Well, apparently that question has already been asked. The time may be coming when Wi-Fi tackles an issue that pesters us daily.

No, Wi-Fi won’t vacuum or dust; it won’t prepare meals or pay bills. No, even better—it will charge batteries. Scientists in the U.S. have developed a device that can convert those Wi-Fi-generated radio signals caroming around your house into DC current. While it may sound somewhat dangerous, it’s not—DC current, unlike the plug-in-the-wall AC type, only flows in one (1) direction. And in this case, it produces—or, rather, they hope it will one (1) day produce—just enough juice to charge electronics, like smartphones, computers, wearables, even, possibly, medical devices.

Here’s how it works

It all starts with a rectenna. Yes, a rectenna. The poorly named device was developed from a semiconductor a few atoms—yes, atoms—thick. And if its microscopic size isn't enough, it's also flexible and sucks up radio signals like a Dyson vacuum. While all radio signals come in the form of very high-frequency AC current, the current they generate comes in minuscule amounts. From there, the rectenna converts those radio signals into battery-charging DC current.

Its Future is so bright

One of the key scientists working on the rectenna is Dr. Tomas Palacios, director of the MIT/MTL Center for Graphene Devices and 2D Systems at the Massachusetts Institute of Technology. He envisions living in a battery-free world. And given the material's flexibility, he imagines a day when it could encase structures of all shapes and sizes like shrink wrap.

According to Dr. Palacios, "What if we could develop electronic systems that we wrap around a bridge or cover an entire highway, or the walls of our office and bring electronic intelligence to everything around us? We have come up with…a way to bring intelligence to every object around us."

When will the rectenna make its first public appearance?

Hang tight, it's not ready for prime time yet. And hopefully when it is, it will be given a better name. In tests, the rectenna can generate around forty (40) microwatts of power when exposed to Wi-Fi signals, which transmit at around a hundred fifty (150) microwatts. While forty (40) microwatts is enough power to light a simple mobile display or activate silicon chips, it's not ready to re-charge those devices we rely on daily. Not to be a downer, but a microwatt is one millionth of a watt and, on average, computers consume about twenty-five (25) watts of power. Do the math. It's not there yet, but hang tight. Remember, this is technology. Things move pretty fast.
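Doing that math makes the gap concrete. Using the figures above (40 microwatts from the rectenna, roughly 25 watts for a computer):

```python
rectenna_uw = 40   # microwatts the rectenna produced in tests
computer_w = 25    # rough average computer power draw, in watts

# a microwatt is one millionth of a watt
shortfall = computer_w / (rectenna_uw / 1_000_000)
print(f"A computer needs about {shortfall:,.0f}x the rectenna's output")
# A computer needs about 625,000x the rectenna's output
```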

Got IoT or Smart City Questions?

Talk to the experts at GDT. Their tenured, talented solutions architects, engineers and security analysts understand how to design and deploy IoT and Smart City solutions for organizations of all sizes to help them realize more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at IoT@gdt.com. They’d love to hear from you.

You can read more about Smart Cities and IoT Solutions here:

These are no dim bulbs

Why Smart Cities? It’s in the numbers

Five (5) things to consider prior to your company’s IoT journey

Without Application Performance Monitoring, your IoT goals may be MIA

How does IoT fit with SD-WAN?

GDT is leading the Smart Cities Revolution

The technology arms race was just amped up

By Richard Arneson

On Monday, the Department of Justice (DOJ) held a press conference to announce that it is seeking criminal charges against Huawei, China's mobile manufacturing giant. The company and its CFO, Meng Wanzhou, who is the daughter of Huawei founder and president Ren Zhengfei, are accused of bank and wire fraud, money laundering and conspiracy. In addition, Huawei is accused of obstructing justice.

At the news conference, FBI Director Christopher Wray stated that the charges "lay bare Huawei's alleged blatant disregard for the laws of our country and standard global business practices." Not surprisingly, Huawei insists it's innocent of all charges.

Wire Fraud

The DOJ claims that Huawei, Meng, and a Hong Kong-based Huawei subsidiary named Skycom Technologies committed wire fraud by violating the United States' trade sanctions against Iran.

Stealing Trade Secrets

In 2014, T-Mobile, the number three (3) wireless service provider in the U.S., slapped Huawei with a civil suit. While a jury ruled in favor of Huawei, determining that T-Mobile didn’t suffer damages and Huawei didn’t engage in willful or malicious conduct, the DOJ wasn’t satisfied with the ruling. They’re convinced that Huawei stole information related to Tappy, a robot T-Mobile uses to test its smartphones.

Meng’s Charges

Meng, who was arrested in Canada last month at the behest of the United States government, is accused of violating the aforementioned trade sanctions against Iran. After her arrest, the United States filed extradition paperwork, but Meng remains in Canada. Following a three-day bail hearing in Vancouver, she was released after posting a $7.2 million bail, but she is being closely monitored by the Canadian government and sports an electronic ankle bracelet.

Huawei’s tough year will probably get tougher

Reportedly, the White House is preparing an executive order to bar U.S. firms from using equipment supplied by Huawei and ZTE, another major Chinese telecom equipment manufacturer.

In addition, draft legislation is making the rounds in Congress that would make it illegal for U.S. companies to sell chips or other components to either Huawei or ZTE.

Both Australia and Japan have been lobbying telecom companies in their respective countries to steer clear of utilizing Huawei to advance their 5G initiatives. They fear the company’s equipment will result in spying by the Chinese government.

Security Questions? Talk to the Experts

To find out how to secure your organization’s network and protect mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, check out the following articles:

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

The Four (4) Horsemen of HCI

By Richard Arneson

In his 1887 painting, The Four Horsemen of the Apocalypse, Russian artist Viktor Vasnetsov depicted the personification of characters described in Revelation, the last book of The Bible─Death, Famine, War and Conquest. Grim, I know. But roughly forty (40) years later, the term "Four Horsemen" was softened after noted sportswriter Grantland Rice used it to menacingly label the University of Notre Dame backfield, led by storied coach Knute Rockne. And now, a hundred and thirty-two (132) years later, I propose it describe the four (4) key characteristics required to fully enjoy a hyperconverged infrastructure (HCI)—Workloads, Storage, Manageability and Economics.

If you’re not using HCI, or plan to, it’s probably time to learn more about it

A 2018 study on HCI found that approximately 25% of companies were currently using hyperconvergence, and another 23% planned on moving to it by the end of that year. While that tally hasn’t been tabulated yet, there’s no question that those percentages are only going to rise. There are simply too many benefits HCI provides for that statement to be anything but true.

In addition to the many benefits HCI delivers—software-defined storage (SDS), an easier way to launch new cloud services, modernization of application development and deployment, and far more flexibility for data centers and infrastructures—it is currently providing customers, according to the study, an average of 25% in OPEX savings. That’s big.

But don’t get too far ahead of yourself. Before you can count on enjoying those benefits, the Four Horsemen of HCI need to be addressed.

1. Workloads

Prior to heading down the HCI road, you must understand the breadth and depth of your organization’s required workloads. HCI solutions have improved significantly; no longer do they simply target VDI (Virtual Desktop Infrastructure) workloads, which had fairly predictable demands. As HCI capabilities have been enhanced, enterprises have expanded their use of it. It’s evolved into a Field of Dreams-esque “If you build it, they will come” proposition. Enterprises now use it to include ERP (Enterprise Resource Planning), databases, and a variety of enterprise software and workloads running in a virtualized environment.

And not all workloads are created equal. Some are better suited for HCI than others. Latency, storage, replication, VMs and IOPS (Input/Output Operations Per Second) need to be understood and evaluated.
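As a rough illustration of the workload evaluation described above, one common first pass is to inventory VMs and their per-VM IOPS to see the aggregate an HCI cluster must sustain. This is a minimal sketch; the workload names and figures below are hypothetical, not taken from the article:

```python
# Hypothetical workload inventory -- all names and numbers are illustrative only.
workloads = [
    {"name": "VDI pool", "vms": 200, "iops_per_vm": 25},
    {"name": "ERP",      "vms": 10,  "iops_per_vm": 1500},
    {"name": "SQL DB",   "vms": 4,   "iops_per_vm": 5000},
]

# Aggregate steady-state IOPS the HCI cluster would need to sustain.
total_iops = sum(w["vms"] * w["iops_per_vm"] for w in workloads)
print(f"Cluster must sustain roughly {total_iops:,} steady-state IOPS")
```

A real evaluation would also weigh read/write mix, latency targets and replication overhead, but even this crude tally shows why skipping the workload step hamstrings the storage decision that follows.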

Starting with Workloads is key; without fully understanding those deployed at your organization, the other three (3) horsemen will find themselves hamstrung.

2. Storage

Once workload evaluation is complete, it's time to understand the storage requirements for each. It can be argued that HCI evolved as a means to deploy software-defined storage. Consideration must be given to deduplication, compression and data protection, and to their effects on runtimes and space efficiency.

In addition, it's important to understand some of the challenges inherent with different storage options, whether HDDs or SSDs, which utilize flash memory. And in the case of SSDs, there will be the need to drill down deeper into their flavors, such as NVMe (Non-Volatile Memory Express) and SATA (Serial Advanced Technology Attachment).

3. Manageability, Operations

Organizations looking to justify the deployment of an HCI solution will be faced with two (2) key factors—cost reductions due to no longer having to purchase—or purchasing fewer—legacy storage solutions, and HCI's ease of management. But without the ability to fully evaluate how current management products can, or should, integrate with those from an HCI solution, additional costs can result from trying to tackle this issue after the solution has been deployed.

4. Economics

When evaluating HCI, often organizations don’t take into account the full spectrum of economics related to it. HCI economics extend well beyond hardware costs, including savings in power, space and labor. And remember, an HCI solution can provide software licensing savings, as well.

Questions about HCI?

If you’re wondering how your organization can benefit from HCI, call on the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from a wide array of industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

Who doesn’t want to Modernize?

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

Survey reveals organizations see the need to utilize more than one (1) public cloud service provider

By Richard Arneson

You could make a strong argument that in discussions surrounding the cloud, the word Hybrid gets bandied about more than Multicloud. But a recent survey conducted at an AWS (Amazon Web Services) user conference may have you begging to differ. The survey, which was conducted by an independent firm, discovered that organizations are using multicloud strategies far more than they’re utilizing a hybrid cloud model.

What may be a little confusing as it relates to the survey is its use of the word multicloud. Multicloud doesn’t exclude private clouds, which are, of course, a required element of a hybrid cloud. Multicloud simply means that more than one (1) cloud is being used; hybrid clouds combine both public and private clouds. But for purposes of this survey, multicloud was used to mean utilizing more than one (1) of the three (3) major cloud service providers.

The survey included over three hundred (300) executives and technicians, and it was administered to help AWS better understand their cloud adoption rates and customers' challenges related to their service. What leapt to the forefront, however, was that almost sixty percent (60%) of respondents operated a multicloud architecture. In other words, in addition to using AWS, they also utilized cloud services from competitors Microsoft Azure and Google Cloud. Cloud users, the survey uncovered, utilize a hybrid cloud model far less than multicloud—almost twenty-five (25) percentage points less, or approximately thirty-three percent (33%).

While ninety-seven percent (97%) of survey respondents use AWS for cloud management (unsure why it wasn't a hundred percent (100%) at an AWS user conference), thirty-five percent (35%) also use Azure, and twenty-four percent (24%) use Google Cloud.

Other takeaways

Almost ninety percent (90%) of respondents use multiple tools to gain visibility into their cloud applications, and thirty-five percent (35%) use three (3) or more. And respondents’ biggest cloud management challenges? Thirty percent (30%) listed cost management and twenty-two percent (22%) cited security.

Moving to the cloud? It all starts with Expertise…and this team has plenty of it

Migrating to the cloud is a big move; it might be the biggest move of your IT career. If you don’t have the right cloud skill sets, expertise and experience on staff, you may soon be wondering if the cloud is all it’s cracked up to be.

That’s why turning to experienced Cloud experts like those at GDT can help make your cloud dreams a reality. They hold the highest cloud certifications in the industry and are experienced delivering and optimizing solutions from GDT’s key cloud partners―AWS, Microsoft Azure and Google Cloud. They can be reached at CloudTeam@gdt.com. They’d love to hear from you.

If you’d like to learn more about the cloud─migrating to it, things to consider prior to a migration, or a host of other cloud-related topics, you can find them here:

Government Cloud adoption is growing, but at the rate of other industries?

The 6 (correctly spelled) R’s of a cloud migration

Are you Cloud Ready?

Calculating the costs–soft and hard–of a cloud migration

Migrating to the Cloud? Consider the following

And learn how GDT’s Cloud Team helped these organizations get the most out of their cloud deployments:

A utility company reaps the benefits of the cloud…finally

A company’s cloud goals were trumped by a poor architecture

Government Agency turns to GDT to migrate mission critical apps to the cloud

Apparently, cyber attackers also consider imitation to be the sincerest form of flattery

By Richard Arneson

Phobos–the personification of fear, in ransomware form

An ambitious, but apparently unoriginal, cybercrime gang is taking responsibility for a rash of malware attacks that began just prior to the Christmas holidays. They’ve named their ransomware Phobos, after the personification of fear in Greek mythology–a fitting namesake for malware that trades on it.

The gang, which apparently forgot to name itself, was inspired by two (2) earlier and very prolific attacks, Dharma and CrySiS, which makes the origin and meaning of the Phobos name pretty self-explanatory. But it’s obvious to security professionals that the gang is only flattering itself; they have little doubt that the same band of reprobates is behind all three (3) attacks.

Dharma

Like Dharma, Phobos preys on open or poorly secured RDP (Remote Desktop Protocol) ports. From these weakened RDPs, Phobos sneaks and slithers into networks and launches the ransomware attack, where it begins encrypting files. It can affect files on local, mapped network and virtual machine drives.

Because it’s ransomware, victims are soon left with a decision: should they, or shouldn’t they, pay to access their affected files, which are locked with the .phobos extension? And, as is usually the case, the attackers want payment in Bitcoin, the currency of choice for launchers of ransomware. (Here’s a not-so-subtle tip: DON’T PAY. It only supports and exacerbates the crime.)

It’s obvious that Phobos is Dharma-inspired—the ransom note looks exactly like the one (1) Dharma used, text, typeface and all. In addition, most of Phobos’ code is identical to Dharma’s; it’s basically a cut-and-paste version of the latter. And, really, why wouldn’t it mimic Dharma? In the cybercrime world, Dharma is probably 2018’s MVP in the ransomware division. Or, at the very least, it’s on the all-star team; it was arguably the most damaging ransomware of the year.

CrySiS

To prevent hurt feelings, the developers of Phobos borrowed from CrySiS, as well. Phobos is so similar to it that anti-virus software often detects Phobos as CrySiS. The differences between the two (2) are so slight that many in the security industry refer to them interchangeably. Technically, though, they’re relatives, and are part of the same sinister crime family.

How are they finding victims’ RDP ports?

Sadly, there’s a marketplace for everything, even RDP ports. On underground cybercrime forums, expansive lists of RDP ports are advertised for sale at bargain basement rates. They’ve been collected by attackers via brute-force attacks, or, in many cases, by playing a game of “Guess the RDP Port.”

Secure ports, backup data, repeat often

To help mitigate the risks of falling victim to ransomware, all RDP ports must be secured with strong passwords. Not doing so means a simple tap on the [enter] key unlocks the gate. And, of course, back up data on a regular basis.
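As a first-pass illustration of that advice, here’s a minimal Python sketch that checks whether a host is even listening on the default RDP port (3389). The host address is a placeholder; an open port doesn’t prove RDP is insecure, only that it’s reachable, which is the first thing attackers look for.

```python
import socket

def is_port_open(host, port=3389, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds.

    connect_ex() returns 0 on success and an errno otherwise, so
    this never raises on a closed or filtered port.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Placeholder host: audit your own machines, never someone else's.
if is_port_open("127.0.0.1"):
    print("RDP port reachable; verify password policy and firewall rules")
```

Anything a check like this flags should sit behind a VPN or firewall rule, or at the very least get a strong password and an account lockout policy.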

If you’re already a ransomware victim, you can go to ID Ransomware and upload one (1) of the encrypted, affected files. They’ll tell you which strain you’ve been infected with—and there are probably more strains than you’d imagined. Currently, ID Ransomware can identify over six hundred (600) of them.
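Services like ID Ransomware match uploaded samples against known artifacts, one of the simplest being the extension appended to encrypted files. Here’s a toy Python illustration of that idea; only the .phobos extension comes from this article, while the other entries are commonly reported extensions included purely for illustration.

```python
# Toy lookup table: appended extension -> likely ransomware family.
# Only ".phobos" comes from the article above; the rest are commonly
# reported extensions, listed here for illustration only.
KNOWN_EXTENSIONS = {
    ".phobos": "Phobos",
    ".dharma": "Dharma",
    ".wallet": "CrySiS/Dharma variant",
}

def guess_strain(filename):
    """Return a likely family based on the encrypted file's extension."""
    for ext, family in KNOWN_EXTENSIONS.items():
        if filename.lower().endswith(ext):
            return family
    return "unknown; try a service like ID Ransomware"

print(guess_strain("quarterly_report.xlsx.phobos"))  # Phobos
```

Real identification relies on far more than extensions—ransom-note text, markers inside the encrypted files—which is how a service can tell six hundred (600) strains apart.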

Ransomware is about more than locking victims’ files. By the time victims realize their files have been locked, the cybercriminals may have been traipsing about their networks for weeks or months—maybe longer. The ransomware may simply be their coup de grâce, launched only after they’ve downloaded as much of the victim’s information as they deem worthwhile.

They’re Security Experts

To find out how to secure your organization’s network and protect mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, check out the following articles:

Last week’s DHS “alert” upgraded to “an emergency directive”

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Last week’s DHS “alert” upgraded to “an emergency directive”

By Richard Arneson

Last week, the US Department of Homeland Security (DHS) issued an alert through its US-CERT (Computer Emergency Readiness Team) division concerning repeated DNS hijacking attacks. Apparently, the alert was well deserved. Yesterday, it was significantly upgraded to an emergency directive due to a spate of recent DNS hijacking incidents that have originated in Iran.

Last week’s alert was due to a recent report published by FireEye, a California-based cybersecurity firm. In it, they provided details concerning a coordinated hacking campaign led by an Iranian cyber-espionage group that had manipulated DNS records for both government agencies and private enterprises.

The hijackers’ end game? Redirect traffic meant for email servers to malicious clones, after which they’ll scan for and gather treasured login credentials.

The Directive

The DHS emergency directive orders all government agencies to carefully audit their DNS records, look for unauthorized changes, and update the passwords for any accounts through which those records can be managed. It also directs them to enable multi-factor authentication on those accounts.

In addition, it urges all IT personnel to monitor Certificate Transparency (CT) logs for recently issued TLS certificates covering government domains, and to pay special attention to any requested by non-government employees.
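The “audit your DNS records” step boils down to comparing what a name currently resolves to against a known-good baseline. Below is a minimal Python sketch of that comparison; the domain and addresses in the usage comment are placeholders, and a mismatch is a red flag to investigate rather than proof of a hijack (CDNs and load balancers rotate IPs legitimately).

```python
import socket  # used in the resolution sketch below

def unexpected_addresses(resolved, baseline):
    """Return any resolved addresses that aren't in the known-good
    baseline. A non-empty result warrants investigation."""
    return sorted(set(resolved) - set(baseline))

# Usage sketch (placeholder domain and baseline):
#   _, _, addrs = socket.gethostbyname_ex("mail.example.gov")
#   flags = unexpected_addresses(addrs, {"203.0.113.10"})
```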

How does DNS Hijacking work?

In the simplest of definitions, DNS hijacking is a means of redirecting traffic to a phony website. As a quick refresher, DNS was invented to translate complex, impossible-to-memorize IP addresses into something that’s far easier to remember (like, for instance, GDT.com—much easier to commit to memory than a string of numbers listed in no intuitive, commonsensical order).

When you type in a website’s name, the DNS is called on to direct traffic to its corresponding IP address. Usually your ISP maintains the DNS servers, and if a hacker can crack into them, let the hijacking hijinks begin. From there, they can change victims’ DNS records to point traffic toward their servers. Then their party starts. They capture as much login information as they can stomach.

Who has been affected by the attack?

Currently, the DHS, according to a report published by Cyberscoop, a multimedia provider of cybersecurity-related news and information, is aware of six (6) civilian organizations that have fallen victim to this particular DNS hijacking attack. What the DHS doesn’t know yet is how many government agencies have been affected. In its directive, it outlined a four-step action plan to address the issue. All government agencies have been given ten (10) business days to complete it.

Security Concerns?

To find out how to secure your organization’s network and mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, check out the following articles:

The Collection #1 data breach—sit down first; the numbers are pretty scary

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Yes, of course you want 5G! But do you really know why?

By Richard Arneson

It’s an exaggeration to say that the race to 5G is in its final stretch, even though several carriers are claiming it is, and that they’ll be the first to cross the finish line. No question, though, the race heats up more each day, and not just in the U.S. In fact, many in the industry, both here and abroad, believe that China is currently the clear and present leader.

But regardless of who, exactly, currently sits at the top spot, a recent survey conducted by PCMag.com reveals that yes, indeed, the public definitely wants 5G, but they don’t really know why. That’s called good marketing.

Has the marketing of 5G really been that good?

While the carriers—and countries—are mortgaging their futures and advertising budgets on 5G, there’s one (1) thing they appear to have overlooked—explaining what it is and why exactly it’s going to be so earth-shattering. Sure, respondents listed faster speeds as a benefit, but, seriously, who couldn’t have guessed that? Have you ever seen a carrier promising slower speeds and more dropped calls?

The survey, which included 2,500 U.S. consumers, found that four (4) out of five (5) Americans basically have no idea what 5G is, much less what it will provide. A quarter of those who claimed to know what 5G is believe they currently have it. Hmmm. Yes, that means many of the twenty percent (20%) who claim to know what 5G is don’t. And of those who believe they’re currently enjoying 5G, almost half believe they have it at home. Apparently, they’re mixing up 5 GHz Wi-Fi with 5G, an understandable mistake, but incorrect. 5 GHz Wi-Fi has been around for almost twenty (20) years. While it operates in the five-gigahertz (5 GHz) band, hence the similar name, it is a short-range, home networking technology that became popular around 2010, when home routers began utilizing 802.11n.

So, what worries consumers about 5G?

If you answered, “higher prices”, you nailed it. But, really, isn’t that answer on a par with “higher speeds”? But give yourself a pat on the back if you answered shrinking data caps, or something to that effect. That ranked just south of higher prices.

While you may have read or heard some of the scare tactics surrounding 5G (or seen them in video form on YouTube), hopefully you’ll fall in line with the vast majority of respondents who aren’t buying it. Eighty percent (80%) stated that they have no safety or health concerns regarding 5G, and that they believe claims to that effect are a bunch of hooey. To allay any fears you may have about 5G: its base stations are not more powerful than current ones, and they’re not solely millimeter-wave. And in the event you’re wondering, millimeter-wave technology has been widely studied, which runs counter to what the technology fearmongers believe and have been disseminating. Concerns surrounding millimeter-wave technology stem from the fact that, because it operates at an extremely high frequency, some believe it will splash consumers with steady waves of radiation. Not so.

If you really, really want to know why you (should) want 5G, click here. Oh, and find out how the government is trying to get 5G into your hands quicker here.

Mobility questions? These folks have the answers

If you have questions about your organization’s current mobility strategy (or the one you’d like to implement) and how 5G will affect it, contact GDT’s Mobility Solutions experts at Mobility_Team@gdt.com. They’re comprised of experienced solutions architects and engineers who have implemented mobility solutions for some of the largest organizations in the world. They’d love to hear from you.

The Collection #1 Data Breach─sit down first; the numbers are pretty scary

By Richard Arneson

It’s interesting what forty-five (45) bucks will buy you these days─a small bag of groceries, a night at the movies with your significant other (if you stick to the small-sized drinks and snacks at the concession stand), and half a parking space at a Dallas Cowboys home game. Also, if you don’t possess a conscience, it can get you three-quarters of a billion unique email addresses.

What happened?

Last week it was revealed by security researcher Troy Hunt that “Collection #1”, an unimaginative name for one (1) of the largest security breaches of all time, is a mass of data—almost 90 GB worth—that includes 773 million unique email addresses and almost 25 million associated passwords. Yes, passwords.

Originally, the data numbered 2.7 billion records, but Hunt jettisoned the garbage to arrive at its current, apparently marketable total.

Just so there’s no confusion, Hunt is the good guy. For years, he’s been researching data breaches and alerting the public of his findings. He shared his recent, pared-down database with the site Have I Been Pwned?, which allows email addresses to be entered to discover whether they are one (1) of the unlucky 773 million. The bad guy(s) are the ones selling access to the database on a file hosting site that shall remain nameless (sorry, no free advertising for evil).

Collection #1 isn’t a new thing; it’s been around approximately two (2) years. Collection #2 came first, and actually puts its digital progeny to shame. Aside from the fact that it was named by a sequentially-challenged hacker, it totals over 500 GB. So, if you’re keeping score at home, both collections total well over half a terabyte of stolen data that is available to miscreants for the one-time fee of $45. A steal—literally and figuratively.

Hunt does offer up a sliver of solace. While he found his email address in the database, the password associated with it was one (1) he’d used many years ago. Whew. However, even if a password was used for email years ago, you may not be out of the woods. For instance, what if it’s the current password you use to log into another site, like—gulp—your bank. It could be a key that unlocks a spate of services.

Yikes! What next?

First, go to Have I Been Pwned? to discover if you’re an undistinguished member of this hacked fraternity. If so, start changing your passwords—all of them. But don’t change them once and never do it again. We’re supposed to replace the batteries in our smoke detectors when daylight saving time begins and ends, right? Add changing passwords to the mix. And with the volume of excellent password management tools available, you have sundry options to address this problem. A password manager isn’t a security panacea, but it can greatly reduce password-related issues.
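For passwords specifically, the same site runs a companion Pwned Passwords service with a k-anonymity “range” API: you hash the password with SHA-1 locally and transmit only the first five (5) hex characters, so the password itself never leaves your machine. A Python sketch of the client side (the network call is left as a comment):

```python
import hashlib

def hibp_range_query(password):
    """Split a password's SHA-1 digest into the 5-character prefix
    you send and the suffix you keep to check the response locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hibp_range_query("password")
# GET https://api.pwnedpasswords.com/range/<prefix>, then search the
# returned "<suffix>:<count>" lines for your suffix locally.
print(prefix)  # 5BAA6 for the (terrible) password "password"
```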

Security Concerns?

To find out how to secure your organization’s network and mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, check out the following articles:

Shutdown affects more than workers

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Government Cloud adoption is growing, but at the rate of other industries?

By Richard Arneson

Just so you don’t have to wait for the obvious, let’s just go ahead and get it out of the way─yes, security is the biggest issue for government agencies moving to the cloud. But it hasn’t deterred half of them, according to a Gartner study published eighteen (18) months ago stating that fifty percent (50%) of all government organizations utilize cloud services. Today the figure is assuredly higher, but we’ll have to wait on Gartner’s next study for that answer.

Based on our everyday, garden-variety, government-related experiences, it’s easy to assume that government deployments, with the exception of those related to the military or national security, of course, are a little antiquated. Let your mind wander and bring into focus those times you’ve forlornly trudged into the post office, DMV or DPS. It feels like walking back in time. You expect to see signs selling war bonds. Sure, there have been a few updates here and there (you can now pay for auto registrations with a credit card!), but, basically, the processes are not that different than they were thirty (30) years ago. Based on these frequent interactions with government agencies, it may come as a surprise to learn that, technologically speaking, they’re definitely not decades behind enterprises, both public and private. In fact, while companies across all industries spend an average of 20.4% of their total IT budget on the cloud, governments, including local, state and federal, clock in at 21.3%.

While security is the top concern, government organizations cite the top two (2) cloud adoption drivers as cost savings and the ability to deliver services more efficiently. Savings and efficiency—sounds about right.

Which cloud are they adopting?

While government cloud adoption is healthy, the three (3) issues throttling it back are security, as previously mentioned; concerns about being locked into a single vendor; and a lack of the key features they need. This is why Gartner opines that the implementation of private clouds by governments will be twice that of public clouds. It’s ironic: governments want more features, but they’re implementing private clouds, which inhibit the very benefits associated with public clouds, including functionality, scalability and cost savings. Again, it’s all about security.

In fact, much of what governments consider a private cloud is actually closer to advanced virtualization or an outsourced infrastructure. While both can work perfectly well for running particular workloads, they aren’t technically private clouds. Here’s what governments need to know—the benefits gap between private and public clouds is widening. Another Gartner survey revealed that less than five percent (5%) of what government entities considered a private cloud actually possessed multiple cloud characteristics. That figure makes you wonder whether many government IT departments want to say they’re running in a cloud environment without actually doing so, at least in a meaningful way. And a poor cloud implementation will likely result in disgruntled users and surly executives. So, ultimately, they’re frustrating users while also failing to achieve and enjoy many, if not most, of the cloud’s benefits.

Where is the data stored?

Data sovereignty. It will always be important for government entities, whether they like it or not. Data sovereignty simply refers to the principle that collected data is subject to the laws of the country in which it’s collected. Governments are uneasy about storing data outside their borders—a concern many agencies share. Recently, the Australian government cancelled a cloud contract upon discovering the vendor was processing government data in an offshore cloud. The UK contracted with a cloud provider, but refused to implement its service until the provider had built a local site. Again, security prevails.

Governments, especially local ones, are probably better positioned to take advantage of the cloud than many enterprises. Budgets are repeatedly cut, which certainly makes cost savings an enticing element. But, regardless, those implementing government clouds must consider, in addition to its unique technical, organizational and procedural structures, the regulatory issues that will always sit atop the list of concerns.

Moving to the cloud? It all starts with Expertise―then an Assessment

Migrating to the cloud is a big move; it might be the biggest move of your IT career. If you don’t have the right cloud skill sets, expertise and experience on staff, you may soon be wondering if the cloud is all it’s cracked up to be.

That’s why turning to experienced Cloud experts like those at GDT can help make your cloud dreams a reality. They hold the highest cloud certifications in the industry and are experienced delivering solutions from GDT’s key cloud partners―AWS, Microsoft Azure and Google Cloud. They can be reached at CloudTeam@gdt.com. They’d love to hear from you.

If you’d like to learn more about the cloud, migrating to it, considerations prior to a migration, or a host of other cloud-related topics, you can find them here:

The 6 (correctly spelled) R’s of a cloud migration

Are you Cloud Ready?

Calculating the costs–soft and hard–of a cloud migration

Migrating to the Cloud? Consider the following

And learn how GDT’s Cloud Team helped a utility company achieve what they’d wanted for a long, long time:

A utility company reaps the benefits of the cloud…finally

Smart Sneaks

By Richard Arneson

     But will they make you jump higher?

Living and working in the age of IoT is nothing short of fascinating. The number of new IoT devices created each day overloads the press release wires. And with wide-scale 5G wireless on the horizon, it’s only going to pick up steam. There will be virtually no facet of our everyday lives that isn’t affected by it.

Not to be left out of the IoT buzz, Nike, the manufacturer of all things worn in the name of athletics, introduced its latest smart product to the marketplace at this month’s CES Show in Las Vegas. No, it doesn’t track heart rates or blood pressure; it doesn’t calculate reps or steps. It simply bends over─virtually, of course─to tie your sneakers.

It’s been several years since Nike first suspected that consumers were simply dog-tired of tying their shoelaces. But they’ve taken that suspicion a step further. Their latest iteration of the self-tying sneaker is the Nike Adapt BB shoe, which can be secured, tightened and adjusted with your smart device. It’s the next evolution of two (2) shoes Nike released in 2016: the Air Mag, which self-laced and even featured lighted soles (it was a limited-edition shoe based on the ones Marty McFly wore in the 1989 film Back to the Future II), and the HyperAdapt 1.0, which accomplished the same feat, but utilized more traditional laces. Both served as market test balloons. Apparently, they both stayed afloat, at least long enough for Nike to determine the public was ready for a smart sneaker.

While the company promises additional self-lacing shoes will be released later this year, the Nike Adapt BB is currently the only one (1) that can be controlled through a downloaded app. The Adapt BB is, naturally, Bluetooth-enabled and waits at the ready to find out how its owner would like their sneakers laced up before hitting the court. Just think: your basketball shoes will house a tiny motor to cinch them down.

Now for the numbers

The Adapt BB, which made its debut yesterday at two (2) basketball games in Europe, will go on sale in the UK on February 17th. They’re priced at £299.95, or approximately $387 in U.S. currency, and each charge (yes, the sneakers need to be charged) lasts about two (2) weeks.

What Nike hasn’t addressed is what you’d imagine afflicts athletes whose height rivals that of a Redwood─fat-fingering. Imagine LeBron James, who makes almost a half a million bucks per game, sitting out a playoff series because he accidentally over-cinched his Adapt BBs just before his smartphone went dead. Maybe the shoe’s 2.0 version will be able to detect if circulation has been cut off.

Get more IoT and Smart City info from the experts

For more information about IoT and Smart City solutions, talk to the experts at GDT. Their tenured, talented solutions architects, engineers and security analysts understand how to design and deploy IoT and Smart City solutions for organizations of all sizes to help them realize more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at IoT@gdt.com. They’d love to hear from you.

You can read more about Smart Cities and IoT Solutions here:

These are no dim bulbs

Why Smart Cities? It’s in the numbers

Five (5) things to consider prior to your company’s IoT journey

Without Application Performance Monitoring, your IoT goals may be MIA

How does IoT fit with SD-WAN?

GDT is leading the Smart Cities Revolution

Late for an appointment? Blame it on a Pole

By Richard Arneson

Remember the days when finding your next appointment meant pulling over, fumbling with a map, then trying to line up the creases before shoving it into your glove compartment? Oh, and if you were running late, finding a phone booth and a quarter held sway over locating a misplaced, winning lottery ticket. Yes, it was a pain, but at least a paper map didn’t change due to our planet’s wonky, wandering magnetic field. Yes, Earth’s magnetic north pole moves a lot, and guess what? Modern-day navigational tools are affected by it. And, yes, that applies to the online maps we’ve all come to rely on.

In the event you’ve had a hunch our planet’s magnetic field is wobbling like a figure skater with an inner ear infection, you’ve been right. Our magnetically-charged north pole is moving, and doing so quickly. Here’s the good news—it’s also moving unpredictably. Nothing says “peace of mind” like hearing Planet Earth is doing something unpredictably. But here’s the rub, at least technologically speaking—navigational models have to be continually updated to account for Earth’s inability to control its wayward magnetic pole. NATO counts on the updates; the Department of Defense (DoD) relies on them; and, yes, your handheld device, tablet and computer do, as well.

Isn’t it always about that pesky Liquid Iron?

Far below the Earth’s surface, liquid iron does what liquids do—it sloshes around. And this movement is what causes the magnetic north pole to wander. Fill a basketball with a pint of water and roll it on the floor. That’s us, trying to carry out our day-to-day tasks on a lopsided basketball. If navigational systems aren’t adjusted to account for the north pole’s migration, which is currently heading from the Arctic Circle towards Siberia, your navigation system of choice is as precise as a stock picker. The degree to which this will affect your ability to find your next appointment is unclear, but it certainly feels unsettling.

The magnetic north pole is currently moving approximately thirty-four (34) miles per year. Yes, that number is relative, but here’s how it relates─the World Magnetic Model, which measures such things, reported that the pole moved about nine (9) miles a year when it was last gauged in 2015 (they measure and update it every five (5) years). So, as you may have guessed based on this four-fold increase in speed, the need to adjust magnetometers (ah, yes, the magnetometers) is becoming increasingly important. The movement is so much faster than in years past that they’re recommending updates take place now, instead of waiting until the calendar flips to 2020. Our smartphones’ navigational components are built around these magnetometers, and the mapping applications we’ve come to rely on are dependent on their accuracy.
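At its core, the correction the model supplies is a single angle, the magnetic declination, added to the heading a magnetometer reports. A simplified Python sketch follows; the axis convention is a common simplification, and the 4.5-degree declination is a made-up placeholder, since real applications pull the value from the World Magnetic Model for the user’s location and date.

```python
import math

def true_heading(mag_x, mag_y, declination_deg):
    """Derive a heading from raw magnetometer x/y readings, then
    correct magnetic north to true north with the declination a
    model like the World Magnetic Model supplies."""
    magnetic_heading = math.degrees(math.atan2(mag_y, mag_x)) % 360
    return (magnetic_heading + declination_deg) % 360

# Placeholder declination of 4.5 degrees; the real value depends on
# where and when you are, which is exactly why the model must update.
print(true_heading(1.0, 0.0, 4.5))  # 4.5
```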

Oh, and here’s another thing the government shutdown has, well, shut down—the World Magnetic Model update, which was supposed to take place today. It won’t happen. Those responsible for the update are cursing Google Maps while circling the block trying to find their upcoming job interview.

If you’re heading to an appointment, you may want to take a map with you—the foldable type.

Mobility Experts with Answers

If you have questions about your organization’s current mobility strategy, contact GDT’s Mobility Solutions experts at Mobility_Team@gdt.com. They’re composed of experienced solutions architects and engineers who have implemented mobility solutions for some of the largest organizations in the world. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure, and organization, here:

Goooooaaaaalll─Technology’s World Cup

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

Government shutdown affects more than workers

By Richard Arneson

As the government shutdown enters its fourth (4th) week, which marks the longest of its kind in U.S. history, the list of ill effects left in its wake now includes IT security. Yay! Like the impasse between the Trump administration and Democrats that’s at the heart of the shutdown—the border wall—this aftershock also includes a wall. This one (1), however, comes in the form of a digitally-encrypted wall—TLS (Transport Layer Security) certificates.

TLS certificates, like SSL certificates, are utilized by websites to secure connections with the users accessing them. For all intents and purposes, TLS is basically the 2.0 version of SSL. Web servers that have installed TLS certificates display their web address with that all-important “S” after “HTTP”. Yes, the “S” stands for secure; it means a cryptographic key has been bound to the website, so communications between it and users are encrypted.

So, what does this have to do with the government shutdown?

It’s reported that as many as eighty (80) government websites—those with a .gov domain name—are no longer TLS-protected. Their certificates have expired. And with government IT workers furloughed, there’s nobody to renew the certificates. So, trying to access one (1) of these unprotected websites will net you this message—Your connection is not private. As a result, users won’t be able to enter the site. Frustrating, yes, but at least they’ll be kept safe by being prevented from entering. However, in several browsers the warning can be bypassed, which means any sensitive information entered, such as social security numbers, won’t be encrypted. If more advanced and adventurous users decide to take this route, they could open themselves up to man-in-the-middle attacks, in which cyber criminals eavesdrop on conversations in the name of ill-gotten gains.
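The expiry check behind those browser warnings can be sketched with Python’s standard ssl module, which converts the human-readable notAfter timestamp found in a certificate into epoch seconds. The timestamp and “current” date below are invented for illustration:

```python
import ssl
import time

def days_until_expiry(not_after: str, now: float) -> float:
    """Days remaining before a certificate's notAfter timestamp passes."""
    expiry = ssl.cert_time_to_seconds(not_after)  # GMT string -> epoch seconds
    return (expiry - now) / 86400

# Hypothetical values: the notAfter string mirrors the format returned in
# the 'notAfter' field of ssl.SSLSocket.getpeercert(); "now" is a pretend
# current date rather than time.time().
not_after = "Jan  5 09:34:43 2019 GMT"
now = time.mktime(time.strptime("Dec 20 2018", "%b %d %Y"))
remaining = days_until_expiry(not_after, now)
print(f"{remaining:.0f} days until expiry")  # browsers warn once this goes negative
```

A real renewal monitor would pull the certificate off the live connection with getpeercert() and alert administrators well before the number reaches zero, which is exactly the step furloughed staff can’t take.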

Naturally, the longer the shutdown, the more sites will be affected. In just over three (3) weeks, eighty (80) certificates have expired. And those eighty (80) sites represent only two percent (2%) of all federal .gov sites. Yikes.

Security Concerns?

To find out how to secure your organization’s network and mission critical data, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

If you want more information about network security, check out the following articles:

DDoS Attacks will deny a Massachusetts Man Ten (10) years of Freedom

Phishing for Apples

This isn’t fake news

Don’t get blinded by binge-watching

Mo Money, Mo Technology―Taylor Swift uses facial recognition at concerts

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Distributed Denial of Service (DDoS) Attacks will deny a Massachusetts man ten (10) years of freedom

By Richard Arneson

Unless you consider ten (10) years in the hoosegow some odd form of payment, it’s widely known that (all together now) crime doesn’t pay. Martin Gottesfeld, a 34-year-old Massachusetts knucklehead, just got reminded of it the hard way, though. He was sentenced to a decade behind bars for cyberattacking two (2) medical facilities on behalf of Anonymous, a hacking activist group (What happened to the days when activist groups tried to get people to vote more or advance women’s rights?). The cyberattacks were launched to protest the treatment of a teen in a high-profile custody case. The sentence was handed down on January 7th, four (4) months after a federal jury found him guilty on two (2) counts, including conspiracy to damage protected computers. His cyberattacks, which occurred in 2014, targeted Boston Children’s Hospital and another nearby medical facility.

Gottesfeld, a computer engineer who hails from the Boston suburb of Somerville, MA, dreamed up his attack after learning about a child custody case involving a teenage girl. Gottesfeld shared the views of several political and religious groups who decided the government’s interference in the case unjustly trumped parental rights.

The teenager, Justina Pelletier, had been taken into custody by the state of Massachusetts after it determined her parents, who insisted their daughter’s health issues were not psychiatric in nature, were interfering with her treatment. Gottesfeld, whose information about the case came from news stories, decided the hospital had misdiagnosed Pelletier. He determined the best way to combat a faulty diagnosis was to launch DDoS attacks on Boston Children’s Hospital and Wayside Youth & Family Support Network, where Pelletier resided after being discharged from the hospital. Gottesfeld’s attack on the hospital disrupted its network for almost two (2) weeks, and interrupted several services used to treat patients.

Just three (3) years ago, Gottesfeld was found floating off the Cuban coastline in a motor-challenged boat. He was rescued by a Disney Cruise ship. While it’s unclear if a large mouse or actual crew member led the rescue efforts, they soon learned that they’d pulled aboard an honest-to-goodness fugitive from justice. As it turned out, Gottesfeld had recently fled the United States upon learning he was the target of a federal investigation. Apparently, Gottesfeld is as poor at selecting boats as he is at cyber-crime.

In addition to giving up a hundred and twenty-one (121) months of freedom, Gottesfeld is required to pay almost $450,000 in restitution, an especially steep price considering he’ll be making about fifty (50) cents an hour until 2029. He should be getting fairly used to living behind bars, though. He was originally taken into custody almost three (3) years ago. Naturally, he has plans to appeal his conviction, but insists he has no regrets.

What to consider before hitting the SD-WAN open road

By Richard Arneson

Estimates vary (greatly), but industry analysts predict the SD-WAN market will clock in somewhere between $4.5 billion and $9 billion by 2022. Yes, that’s quite a variance, but whether it finishes at the high or low end, the growth is staggering considering it’s now just shy of $1 billion. Yeah, it’s big, and why wouldn’t it be? SD-WAN can provide significant cost savings, faster and easier location turn-up with automated authentication and configuration, higher speeds and more bandwidth. And because it can be centrally managed through Controller software, it’s far easier to manage security, control policies and compliance. Who wouldn’t want some of that action?

But before you dive in, there are a number of things to consider, and prepare for, prior to starting your SD-WAN journey. That path may look like the Yellow Brick Road, but, as Dorothy and her inept buddies found out, it can be rife with danger. But forget the witches and flying monkeys. This is the real world, where outages, unbudgeted costs, and losses in productivity and revenue are a lot scarier.

Following are a few of the elements to carefully consider prior to hitting the SD-WAN road:

Migration

It’s important to consider which sites to migrate first, the broadband service providers to use, application response times and which ones should be moved to the cloud, personnel needing to be involved, and how the overall experience of end users will be affected. In addition, think about how many physical appliances will continue to be utilized, which should be virtualized, and what a hybrid of the two (2) would look like.

Most Common Challenge

One (1) of the most daunting aspects of moving to an SD-WAN architecture concerns which vendor to use. Currently, there are almost fifty (50) from which to select, and that’s not to mention the SD-WAN solutions offered by most of the larger service providers. Ever hear of paralysis by analysis? Selecting the “right” vendor for your needs can paralyze even the most seasoned IT staffs. It’s a big decision, and one (1) you’ll be living with for a while.

SD-WAN Pre-Deployment

Start with a baseline that takes into account your current environment. A journey begins with a single step, but that step can’t be taken until you know where to start. The current state of your network architecture should include an inventory of all assets, users, applications and network paths and connections. And consider how SD-WAN will play with your current infrastructure. How well would a hybrid of the two (2) work during the transition?
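One way to make that baseline concrete is to capture each site’s links and applications in a simple inventory structure. The sites, carriers and fields below are invented for illustration, not a prescribed schema:

```python
from dataclasses import dataclass, field

@dataclass
class Link:
    carrier: str         # e.g. an MPLS or broadband provider name
    kind: str            # "mpls" or "broadband"
    bandwidth_mbps: int

@dataclass
class Site:
    name: str
    links: list[Link] = field(default_factory=list)
    apps: list[str] = field(default_factory=list)  # applications in use at this site

# Hypothetical two-site baseline reflecting a hybrid of MPLS and broadband.
inventory = [
    Site("HQ", [Link("CarrierA", "mpls", 100)], ["erp", "voip"]),
    Site("Branch-1", [Link("CarrierB", "broadband", 200)], ["voip"]),
]

# One simple pilot-site heuristic: sites that already have a broadband
# link are easier first candidates for SD-WAN migration.
pilot_sites = [s.name for s in inventory
               if any(l.kind == "broadband" for l in s.links)]
print(pilot_sites)  # → ['Branch-1']
```

Even a rough inventory like this makes the migration-order and hybrid questions answerable with data rather than guesswork.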

Deploying SD-WAN doesn’t have to mean you’ll automatically bid MPLS a fond farewell. Maintaining MPLS and utilizing SD-WAN to augment it may be the best option. And consider how cloud services you utilize, or plan to, will get incorporated during the deployment.

Deployment―the first step

SD-WAN deployment needs to be immediately (or close to it) validated after cut-overs. Cut your teeth on a few smaller “pilot sites”, and learn from those experiences. This is the ideal time to analyze and verify that intended policies are being adhered to by the SD-WAN controller, and that applications are performing as intended. Again, learn from these pilot sites; doing so will help future implementations go more smoothly.

Ongoing Management and Visibility

If you thought one (1) of SD-WAN’s benefits is that you can simply set it and forget it, you need to wipe that notion from your mind. If not, you’ll hamstring your SD-WAN deployment. Even though SD-WAN controllers continually make policy decisions based on the network’s state and select dynamic paths for applications, that doesn’t mean IT staffers can kick back and play Fortnite. The state of the network needs to be continually monitored for policy exceptions; ignore this key component and applications’ performance will deteriorate.
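Monitoring for policy exceptions can be as simple as comparing measured path metrics against each application’s policy. The policies, thresholds and measurements below are invented; a real deployment would pull both from the SD-WAN controller’s own interfaces:

```python
# Hypothetical per-application policies and path measurements.
policies = {
    "voip": {"max_latency_ms": 150, "max_loss_pct": 1.0},
    "erp":  {"max_latency_ms": 300, "max_loss_pct": 2.0},
}

measurements = [
    {"app": "voip", "path": "broadband-1", "latency_ms": 210, "loss_pct": 0.4},
    {"app": "erp",  "path": "mpls-1",      "latency_ms": 120, "loss_pct": 0.1},
]

def policy_exceptions(policies, measurements):
    """Flag any measurement that violates its application's policy."""
    out = []
    for m in measurements:
        p = policies[m["app"]]
        if m["latency_ms"] > p["max_latency_ms"] or m["loss_pct"] > p["max_loss_pct"]:
            out.append((m["app"], m["path"]))
    return out

print(policy_exceptions(policies, measurements))  # → [('voip', 'broadband-1')]
```

Run continually, a check like this surfaces the deteriorating paths before users notice the applications riding on them.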

Also, SD-WAN isn’t an island unto itself. It needs to interoperate with other areas of the network, and without insight into this, and how it’s impacting your organization, your SD-WAN deployment won’t deliver the desired results.

High visibility is critical to carefully managing an SD-WAN deployment; it’s what will ultimately make or break it. And from this, you can better understand the impact it is delivering, and will continue to deliver, to your organization. If critical performance measurements are evaluated on an ongoing basis, it will help with troubleshooting, proactive alerting and policy optimization.

Got questions? Call on the SD-WAN experts

To find out more about SD-WAN and the many benefits it can provide your organization, contact GDT’s tenured and talented SD-WAN Engineers and Solutions Architects. They understand the many factors that must be considered prior to SD-WAN deployments, including link optimization, broadband usage, network architecture and the impact of moving on-prem infrastructures to the cloud. And they work with a wide array of SD-WAN providers. They’ve implemented SD-WAN solutions for some of the largest enterprise networks and service providers in the world, and helped them optimize their ROI. They’d love to hear from you.

Get more SD-WAN information here, including: 

Dispelling the myths surrounding SD-WAN

How SD-WAN fits with IoT

Demystifying SD-WAN’s overlay and underlay networks

SD-WAN’s relationship with SDN

Why the SD-WAN market will grow by 1200% by 2021

And to see how GDT’s SD-WAN experts delivered the perfect solution to a global software company, click here.

These are no dim bulbs

By Richard Arneson

Smarter than your average bulb

What do you use dozens, perhaps hundreds, of times a day without taking notice? Nope, it’s not your smart phone, TV or refrigerator…the light bulb. Yes, the light bulb, that ignored, often cursed (when the filament disintegrates at the most inopportune time) staple of everyday life. Can we comprehend life without it? Imagine reaching for the kerosene canister upon entering your home at night, or patting down nearby surfaces for a match to light the lamp. No, thanks. But soon that taken-for-granted glass bulb could be providing more than simply a means to light your path. It will become an integral part of your connection to the outside world.

Li-Fi, short for Light Fidelity, is a Visible Light Communications (VLC) system that utilizes the very light you use to illuminate the pages of your favorite novel to transmit wireless data. Say what? Yes, the data is embedded in the light beam. Li-Fi enabled devices convert those beams into electrical signals, which, in turn, get converted back to data. The term Li-Fi was coined by German physicist Harald Haas in 2011, when he envisioned the idea of using light bulbs as, essentially, wireless routers. [Click here to watch Haas’ 2011 TED Talk on the subject]

How it works

Li-Fi bulbs contain a chip that modulates visible light between 400 THz and 800 THz for optical data transmission. Photoreceptors receive the data. The beam of light flickers and dims imperceptibly to the human eye, all while being converted into electrical signals by the receiver. That signal is converted back to binary data. Researchers claim Li-Fi can transmit data up to 100 times faster than that of current, radio wave-based Wi-Fi networks. Just think, while the bulb is lighting the pages of your latest Sports Illustrated, it can also be downloading several feature length films (HD, no less) in a few seconds. It’s a cheaper, optical version of Wi-Fi.
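The modulation idea, flickering a light source faster than the eye can register, can be illustrated with a toy on-off-keying encoder, in which each bit of the data maps to the bulb being on or off for one symbol period. This is a deliberately simplified sketch, not the actual scheme any Li-Fi product uses:

```python
def to_ook_symbols(data: bytes) -> list[int]:
    """Toy on-off keying: each bit becomes one on (1) or off (0) light pulse."""
    symbols = []
    for byte in data:
        for i in range(7, -1, -1):   # most significant bit first
            symbols.append((byte >> i) & 1)
    return symbols

def from_ook_symbols(symbols: list[int]) -> bytes:
    """The photoreceptor side: regroup received pulses back into bytes."""
    out = bytearray()
    for i in range(0, len(symbols), 8):
        byte = 0
        for bit in symbols[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

pulses = to_ook_symbols(b"Li-Fi")
assert from_ook_symbols(pulses) == b"Li-Fi"   # round trip: light back to data
print(len(pulses))  # → 40
```

Real systems use far denser modulation than one bit per pulse, which is where the claimed 100x speed advantage comes from.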

Unlike Wi-Fi, Li-Fi doesn’t interfere with radio signals transmitted by, for instance, nearby access points, such as routers. And researchers claim Li-Fi is more secure than Wi-Fi because, well, it’s easier to block out light. That’s right, pulling down the curtains will quash connectivity. Try blocking radio signals with venetian blinds.

Yes, there are some limitations

Li-Fi coverage is limited to ten (10) meters; Wi-Fi, conversely, will travel over three times (3x) that distance. And Li-Fi won’t work while you’re sunning by the pool. It’s limited to indoor use, although researchers claim it can work outdoors, but only if it’s a gray, dismal day. But, seriously, how much fun is that? And, yes, if the light is obscured, connectivity is lost. It won’t work through walls, blinds or curtains. If the light can’t get to the Li-Fi receiver, data transmission comes to a grinding halt.

What could Li-Fi mean to IoT?

The mere thought of Li-Fi can flood the mind with ideas and possibilities about how it could be used. But those same minds are faced with its limitations. So, more than likely, it will serve, when it’s more widely available, as an adjunct to Wi-Fi. But, again, those speeds! Currently, lighting titan Philips offers full and integrated Li-Fi products and services. And if Philips has invested in it, can widespread use be far behind?

With more and more IoT devices being introduced daily, the need for faster, more ubiquitous connectivity will reign supreme. And it appears Li-Fi might fit the light bill.

Get more IoT and Smart City info from the experts

For more information about IoT and Smart City solutions, talk to the experts at GDT. Their tenured, talented solutions architects, engineers and security analysts understand how to design and deploy IoT and Smart City solutions for organizations of all sizes to help them realize more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at IoT@gdt.com. They’d love to hear from you.

You can read more about Smart Cities and IoT Solutions below:

Why Smart Cities? It’s in the numbers

Five (5) things to consider prior to your company’s IoT journey

Without Application Performance Monitoring, your IoT goals may be MIA

How does IoT fit with SD-WAN?

GDT is leading the Smart Cities Revolution

Phishing for Apples

By Richard Arneson

It’s impossible to know what fish will bite on, which is probably why people are drawn to fishing. Sure, it seems a little odd that human beings take pleasure in outwitting something with a pea-sized, waterlogged brain, but, yes, it’s fun. One (1) day fish like cheese, the next day, salmon eggs. Sometimes they like to bite in the morning, then, the next day, they collectively decide that it’s better to nosh on worms at dusk. Phishing, like fishing, is no different. It’s about changing bait. When emails aren’t producing the desired results, scammers dangle something else from the hook. And in one (1) of the more recent phishing scams, that’s exactly what they did. Their new bait is phone calls, and the targets are iPhone users.

If you have a cell phone, you’ve no doubt been inundated with a staggering number of spam calls of late, which dodge call blocking by displaying different phone numbers each time they’re placed. Also, along those lines, they’re impervious to the Do Not Call Registry. And by fooling recipients with the use of local area codes (known as “Neighborhood Spoofing”), the calls get picked up more readily. Don’t be ashamed if you’ve answered, or been fooled by, one (1). During an NPR interview, FCC chairman Ajit Pai stated that “every now and then, even on my work Blackberry, I’ll see a call that seems to be coming from the 202 area code, which is here in Washington. And I know for a fact that, you know, it’s probably not someone calling from the office. Sometimes, I answer just for the heck of it. And, lo and behold, I’ve won a vacation.”

That’s not to suggest, however, that all spam calls exist to phish. Sure, they’re annoying as mosquitoes at a concert in the park, but most aren’t designed with evil intentions. However, this one (1) is. And what has made it especially deceptive is that the number revealed to the recipient contains the Apple logo, including its correct address and corporate phone number. It looks like the real deal.

The scam was revealed last week after an IT security firm reported that they’d received an automated call stating multiple Apple servers maintaining Apple IDs had been compromised. They were directed to contact a supplied toll-free number immediately to clear up the matter (there’s your red flag). But what’s more disconcerting is that when the security firm contacted Apple’s support number to report the scam (using the same number that the scammers presented), neither AT&T, its wireless provider, nor Apple could differentiate between the two (2) numbers. In other words, the fake call was indexed in their “Recent Call(s)” list as Apple’s legitimate customer support number.

If you’re sick of spam calls, this may make you seriously ill

First Orion, an Arkansas-based company that provides caller ID and call blocking solutions, recently published a report that predicts almost half of all mobile phone calls in 2019 will be spam. Ouch. They claim to have analyzed over 50 billion calls that were placed over the past eighteen (18) months. By combining call patterns and behaviors with other attributes, First Orion arrived at this startling, but hopefully errant, prediction.

This isn’t fake news

By Richard Arneson

This past Saturday, December 29th, while you were trying to figure out which of the forty-one (41) bowl games you should tune in to, a cyberattack hit the Los Angeles Times and a number of Tribune-owned newspapers, including the Chicago Tribune and Baltimore Sun. Readers hoping to sit down with a steaming Cup of Joe and the sports section were soon disappointed when their paper didn’t arrive. The attack crippled the distribution of newspapers that utilize the same production platform.

An unnamed source states that the attack appears to have originated outside the U.S., but any additional information supporting those findings hasn’t been disclosed.

The malware was initially detected the day before by the Tribune-owned Orlando Sentinel, but doesn’t appear to have affected any credit card or personal information of their, or any of the other papers’, subscribers. Only back-office systems were disrupted, and none of the newspapers’ websites were affected. It appears that the attack wasn’t launched to steal data, just take over servers and ruin readers’ Saturday morning.

The attack crippled some newspapers more than others. The San Diego Union Tribune appears to have taken the biggest hit; almost ninety percent (90%) of their Saturday editions didn’t get delivered.

Also included in the attack were the New York Daily News and the West Coast editions of the Wall Street Journal and The New York Times.

The Department of Homeland Security (DHS) is looking into the event, but neither it nor the FBI has issued any comments related to the attack.

Insult to Injury

Newspaper-attacking malware seems especially vicious considering what’s happened to the industry over the past twenty (20) years. At one (1) time, there were few media jobs that carried more cachet than that of a newspaper reporter. Just ask Woodward and Bernstein. The Internet has changed all of that, of course. To make matters worse, newspapers were slow to re-invent themselves in the digital age. According to a report from the Pew Research Center released six (6) months ago, employment in newsrooms has fallen by twenty-three percent (23%) in the past nine (9) years. And the hardest-hit newsrooms were in the newspaper industry.

Don’t get blinded by binge-watching

By Richard Arneson

“Don’t make me tell you again. Don’t click on the links.”

If you’re like most people, you may have used the holiday break for something to augment all of the shopping, eating, getting together with family and friends, eating, sleeping in late, and eating. While you binged on food, there’s a greater than average chance you binged on something else―a show, or shows, that you’ve wanted to see for some time. And there’s no better medium on which to binge-watch than Netflix. But the Federal Trade Commission (FTC) wants you to know that if you have a Netflix account, you better watch out. They’re not talking about Santa. No, they’ve issued a phishing alert.

One (1) month ago, police in Ohio took a screenshot of the Netflix phishing scam and alerted the FTC, which issued its warning thirty (30) days later, on December 26th. The FTC was a little behind the curve, though―the UK’s Action Fraud service issued a similar warning a few months back.

Here’s how it works

The Netflix phishing scam is your basic, garden-variety attempt to access customer accounts to garner sensitive info, such as, naturally, credit card information. And they’re hoping to find Netflix customers who have committed the cardinal sin of logging on—using the same user name and password for other accounts. Yes, the thought of keeping up with unique logins and passwords for the dozens of sites you frequent sounds daunting, but there are some great products on the market that will help you manage them and keep you safe.

Targets of the Netflix phishing scam receive an email that, of course, appears to have been sent by Netflix. It claims that your Netflix account has been put on hold, which means you won’t be able to access the 9th season of The Walking Dead. The scammers are hoping fans will perform a panic-click on the link to re-enter information that will unlock Rick Grimes’ final episodes. The message declares that Netflix is “having trouble with your current billing information” and it needs to be re-entered (Tip: when you see the word “billing”, please be careful.).

This Netflix phishing email is unlike many others because it looks fairly convincing. There are no horribly misspelled words, misplaced punctuation marks, or grammar suggesting the writer is totally unfamiliar with the English language. Yes, the “Dear” salutation is pretty weird, but it might slip past an undiscerning reader.

If you receive a Netflix phishing email, you can report it to the FTC at ftc.gov/complaint. And it’s a good idea to forward the email to Netflix at phishing@netflix.com.

Here are some Quick Tips on how to protect yourself against Phishing Scams

  • Utilize an anti-virus product that can detect fraudulent and malicious websites, or what they may refer to as anti-phishing technology in their marketing materials.
  • Type in the URL of the retailer’s website. This will ensure you’re heading to the right place. I know, it’s easier to click on the link, but typing it in will only cost you a few additional seconds.
  • If you’re ever questioning a site’s authenticity, type in a fake password. If it’s accepted, trouble’s lurking—they’ll accept anything for the password. Close it out and delete your browsing history.
  • Also, regularly inspect your credit card and bank statements. It’s not fun reading, or an activity you’ll look forward to, but careful inspection is one (1) of the best medicines.
  • When you see all CAPS in the subject line, you’ve probably received a phishing email. Why scammers like ALL CAPS is unclear, but it’s a common practice.
  • Check that the e-commerce site you’re visiting begins with https://, not http://. The S is for Secure, meaning all communications between you and the website are encrypted.
  • Look for misspelled words or really, really poor grammar. You won’t need an English degree to spot it—it’ll dramatically stand out.
  • If you’ve entered a site and the images are of poor quality or low resolution, you’re probably on a fraudulent site. You won’t see butchered images on the websites of reputable retailers.
  • Hover your mouse over links embedded anywhere in the email. If the link address looks odd and/or doesn’t represent the proper company, don’t click on it.
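That last tip, comparing where a link claims to go with where it actually points, is easy to automate. Here is a minimal sketch using only the standard library; the email snippet and the naive "suspicious" heuristic are invented for illustration:

```python
from urllib.parse import urlparse
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, visible text) pairs from an HTML email body."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
    def handle_data(self, data):
        if self._href:
            self.links.append((self._href, data.strip()))
            self._href = None

def suspicious(href: str, text: str) -> bool:
    """Flag links whose visible text names a domain the href doesn't contain."""
    host = urlparse(href).hostname or ""
    return text.lower() not in host.lower()

# A made-up phishing link: the text says netflix.com, the href says otherwise.
email_body = '<a href="http://netflix.example-billing.ru/login">netflix.com</a>'
parser = LinkExtractor()
parser.feed(email_body)
flags = [(href, suspicious(href, text)) for href, text in parser.links]
print(flags)
```

This heuristic is deliberately crude; a production filter would compare properly extracted registered domains. But it captures exactly what the hover check does by eye.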

“Gooooooaaaal”―Technology’s World Cup

By Richard Arneson

In the event you’re unaware, there is a World Cup currently taking place. These matches, however, aren’t being played on a football pitch, but in offices, data centers and engineering labs the world over. The coveted World Cup of 5G pits dozens of service providers and countries against one another for the right to claim that they are the first to offer this long-awaited wireless evolution. While some may refer to 5G as an upgrade from 4G, that’s a gross understatement. That would be like calling a 1964 Corvette an upgrade from a Ford Model T. What 5G will provide when compared to 4G will require words like, “order of magnitude.” (Read more about it here.)

While it’s unclear exactly who’s leading the pack (not surprisingly, most carriers, or countries, claim they’re in the lead), you have to place one (1) at, or near, the top—China. In fact, global consulting firm Deloitte has no qualms about naming China as the leader in the clubhouse. In its 5G study published earlier this year, they unequivocally declare China the leader in the 5G race. According to the study, since 2015 there have been roughly 30,000 new cell sites built in the U.S. During that period, China has constructed over ten times (10x) that amount—350,000.

Based on the number of mobile subscribers and size of network, China is miles past its nearest competitors, India and the United States. And it’s apparent the Chinese government views anything but a gold medal in the 5G race to be an embarrassment. China identified 5G as the highest priority in its multi-faceted technology roadmap, which they entitled, “Made in China 2025.” They have worked furiously to set global technical standards. They want to win it—badly. And that determination is raising a few hairs on the backs of Western governments’ necks. Sure, they don’t want to lose the race, but they fear what China’s focus on breaking the 5G tape first may mean to national security.

A balloon that promptly deflated

In February of this year, national security staffers in the Trump administration proposed a very un-Republican move—nationalization of a wireless network. In the proposal, which could be better described as a trial balloon, the federal government would build and pay for a national 5G network from which service providers could lease capacity. It was a rather odd idea; fighting China’s 5G deployment with a very China-like proposal―government ownership.

The balloon barely got off the ground. All four (4) major U.S. wireless carriers (AT&T, Verizon, T-Mobile and Sprint) vehemently opposed wireless nationalization; it ran a tad counter to the notion of having a competitive advantage. And the Federal Communications Commission (FCC), which is led by Trump-appointed Chairman Ajit Pai, didn’t take a shine to the idea, either. In a written statement, Pai said, “The main lesson to draw from the wireless sector’s development over the past three decades — including American leadership in 4G — is that the market, not the government, is best positioned to drive innovation and investment.”

But don’t think the FCC simply turned its back on U.S. carriers’ participation in the great race. In what was seen by many as a “make-up call,” on August 2nd the FCC voted on rules they termed OTMR (One Touch Make Ready), which are designed to hasten the rollouts of 5G networks. These rules address the strict, cumbersome laws in place that specify required distances that must separate network elements attached to a pole—usually a telephone pole. It’s not so much the distances that are the problem, but what’s involved in getting them approved on a pole-by-pole basis (See government red tape).

The U.S. government wasn’t finished, however. It continues to increase pressure on allies to boycott products from China’s largest telecom manufacturers, Huawei and ZTE. It isn’t so much a tactic to enhance U.S. carriers’ 5G deployment as one (1) that hits both companies squarely in the wallet (the boycotts almost buried ZTE). But Huawei sees this as a vindictive step that will only serve to hinder the United States’ 5G market. According to one (1) of Huawei’s chairmen, Eric Xu: “For Huawei, as leader in 5G technology, we don’t have the opportunity to serve the US consumer with 5G solutions and services, then the US market is a market without full competition while still blocking leading players from participation. Now, I’m not sure whether they can really deliver their objective of becoming the world’s No. 1 in 5G.”

The Final Match

It appears that crowning a 5G champion has been whittled down to a two (2) country race. But, unlike its football counterpart, this World Cup winner will likely be left up to interpretation.

Mobility Experts with Answers

If you have questions about your organization’s current mobility strategy (or the one you’d like to implement) and how 5G will affect it, contact GDT’s Mobility Solutions experts at Mobility_Team@gdt.com. They’re composed of experienced solutions architects and engineers who have implemented mobility solutions for some of the largest organizations in the world. They’d love to hear from you.

You can read more about how to digitally transform your infrastructure here:

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

Automation and Autonomics—the difference is more than just a few letters

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

“Rudolph, with your GPS…”

By Richard Arneson

If you really want to get into the Christmas spirit, catch a quick flight to Finland, where the reindeer roam the countryside like squirrels at a city park. It’s estimated that there are as many reindeer as there are people in Finland, and if your livelihood depends on the animal, herding them throughout the country’s northern region is a bear. That is, until now: reindeer will soon be given a label that’s never been used to describe them—smart.

Something else that’s not associated with reindeer—IoT

Finland’s Reindeer Herding Association has been working with Digita, a Helsinki-based communications company, to design a reindeer collar that uses GPS and Digita’s long-distance wireless network. In Lapland, Finland’s northernmost region, there are over fifteen hundred (1,500) reindeer herders who rely almost solely on the animal to provide their living. It doesn’t sound like a lot, but considering the region is home to only a hundred and eighty thousand (180,000) Finns, it’s not an insignificant number. They rely on reindeer for milk, meat and fur, and the reindeer trade dates back hundreds of years. The profession is revered there.

Herders’ high cost of loss

At least five thousand (5,000) reindeer are killed each year, and most of their remains are never found. While the deaths are primarily caused by four-legged predators from nearby Russia—mainly wolverines and lynx—without a body the herders can’t collect a significant stipend from the Finnish government to cover their losses. And when herders lose, on average, roughly ten percent (10%) of their herd each year, that’s a decent chunk of Finnish change.

Herders who decide not to deploy the Internet-connected collars will continue to do what their forebears have for centuries—roam the vast, snow- and forest-covered countryside on foot to count their herd one (1) Blixen at a time.

Each collar is fitted with a GPS device that’s approximately the size and shape of a deck of cards. They’re hoping to get it down to the size of a quarter—an American quarter―and jettison the collar altogether. Instead, it will be attached to the reindeer’s ear.

If a collared reindeer doesn’t move for four (4) hours, the herder is alerted to the inactivity. If it’s been felled by a predator, the herder knows exactly where to find the remains. And with the remains comes that government payment. In addition, use of GPS to track and herd reindeer has reduced the need to employ as many workers. The largest stumbling block had been battery life, but the latest prototype lasts for approximately one (1) year. It’s estimated that the device will cost about a hundred American dollars, but tracking large herds won’t require each reindeer be given a collar. They’re counting on attaching only one (1) to the herd’s female leader, which will allow them to know the whereabouts of the entire herd (apparently, they don’t stray far from the female in charge).
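The four (4)-hour inactivity rule described above is simple enough to sketch in code. Everything below, from the movement threshold to the record layout, is a hypothetical illustration of the idea rather than Digita’s actual implementation:

```python
from datetime import datetime, timedelta

# Hypothetical parameters for the inactivity check. Positions are
# (timestamp, latitude, longitude) tuples reported by the collar's GPS.
INACTIVITY_WINDOW = timedelta(hours=4)
MOVEMENT_THRESHOLD_DEG = 0.0005  # roughly 50 meters of latitude

def is_inactive(positions, now):
    """Return True if the collar hasn't moved meaningfully in the window."""
    recent = [(t, lat, lon) for t, lat, lon in positions
              if now - t <= INACTIVITY_WINDOW]
    if not recent:
        return False  # no fixes in the window, so we can't conclude anything
    lats = [lat for _, lat, _ in recent]
    lons = [lon for _, _, lon in recent]
    # "Hasn't moved" means every recent fix sits inside a tiny bounding box.
    return (max(lats) - min(lats) < MOVEMENT_THRESHOLD_DEG and
            max(lons) - min(lons) < MOVEMENT_THRESHOLD_DEG)
```

A herder’s app would call something like `is_inactive` on each collar’s recent fixes and raise an alert when it returns True.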

In addition to simplifying reindeer herding, reducing costs and increasing revenue, the Reindeer Herding Association hopes bringing IoT technology into the mix will help provide future generations incentive to carry on the occupation’s longstanding Finnish tradition.

Internet of Things (IoT) questions?

GDT’s tenured, talented solutions architects, engineers and security analysts understand how to design and deploy IoT—including Smart City—solutions for enterprises, service providers, government agencies and cities of all sizes to help them enjoy more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

You can read more about IoT and Smart City Solutions below:

Farmers may soon have a new, hard-working friend

Why Smart Cities? It’s in the numbers

Five (5) things to consider prior to your company’s IoT journey

Without Application Performance Monitoring, your IoT goals may be MIA

How does IoT fit with SD-WAN?

GDT is leading the Smart Cities Revolution

Who doesn’t want to modernize?

By Richard Arneson

COBOL. Remember it? Haven’t heard that name in a while, right? No, it hasn’t gone the way of Novell NetWare or Blockbuster Video. Even though it dates back to the 1950s, it’s still used, and used widely, today. In fact, according to InformationWeek, over seventy percent (70%) of business transactions are still processed by COBOL; or, as it’s never referred to, Common Business-Oriented Language. But if you do hear “COBOL”, know this—it can’t be combined with the word modernization.

Application modernization allows organizations to advance legacy apps into ones that are more nimble, can reduce costs and, better still, free up time so IT staff can work on more forward-thinking, business-changing initiatives. But application modernization doesn’t refer to ripping out legacy apps and building new versions from the ground up. It’s like a frame-off car restoration—the bones stay, but the application gets an overhaul to be more cloud- and mobile-friendly. The degree to which an application needs to be modernized can vary greatly. Some may require a heavy dose of re-coding, while others may need less invasive upgrades. For instance, IBM still utilizes applications that were written for mainframes fifty (50) years ago; as you might imagine, modernizing those applications would require a considerable amount of work.

If you’re looking at modernizing applications, consider the following:

Don’t be a wallflower

Don’t be shy. Solicit information, and lots of it. Get out there and talk to applications’ users. There’s no such thing as too much feedback. If you give this consideration short shrift, you’ll be unnecessarily complicating your modernization goals. Talking to users will provide you with a wealth of information, including issues you won’t find in documentation. They’ll probably be able to relay golden nuggets of information that have never been considered by the IT staff. Without this level of upfront education, re-coding will be required, deployment will be delayed, and architects will lose their senses of humor.

If it ain’t broke…

There’s no law that requires organizations to modernize applications. Remember, IBM is successfully using legacy applications that perform as well today as they did when LBJ was president. Don’t assume all applications need to be “modernized.” Don’t get hung up on that word.

…but if it is

If you’ve determined that particular applications can no longer be saddled with the “legacy” label, make sure to look for any redundant code in them prior to attempting to make them mobile-, cloud- or digital-ready. This code-level detective work, if done properly and comprehensively, can reduce costs and enhance efficiencies.

Map how data flows throughout the organization and your migration will thank you for it

One (1) of the biggest challenges facing application modernization involves understanding how data is represented in legacy applications and those with which it integrates. In other words, understand applications’ interdependencies with other apps, including which data flows to, and from, them. And, of course, know how it flows. Without comprehensively mapping how data flows through your organization, count on costs going up and efficiency going down.

Once data flows are understood and documented, they need to be scrubbed clean and standardized. If you skip this step, “garbage in, garbage out” will soon be ringing through the hallways. And standardizing data (a great use of automation) means applications agree on what is, and isn’t, considered data. The result? Applications can more easily sync with other applications, which helps the data migration progress more smoothly.
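As a small illustration of that standardization step, here is a minimal sketch (the legacy date formats are invented for the example) that normalizes disagreeing date representations to one canonical form before migration:

```python
from datetime import datetime

# Hypothetical formats that different legacy applications might use for the
# same date field; the goal is to normalize all of them to ISO 8601.
LEGACY_FORMATS = ["%m/%d/%Y", "%d-%b-%y", "%Y%m%d"]

def standardize_date(raw):
    """Convert a legacy date string to 'YYYY-MM-DD', or raise ValueError."""
    for fmt in LEGACY_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {raw!r}")
```

Running every inbound record through a function like this is what lets applications “agree on what is, and isn’t, considered data” before the migration begins.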

Automate Testing

Another great use of automation is in the testing phase, and automating processes is both a time saver and a headache reliever. Speed aside, it’s simply a more accurate and objective way to handle application testing.
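To make that concrete, here is a hedged sketch of what an automated test suite might look like; the function under test is a made-up stand-in for a modernized legacy routine:

```python
import unittest

# Hypothetical routine standing in for a just-modernized legacy function.
def apply_discount(price, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    """Checks that run identically on every build, with no tester fatigue."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 15), 85.0)

    def test_zero_discount_is_a_no_op(self):
        self.assertEqual(apply_discount(49.99, 0), 49.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(10.0, 150)
```

Running `python -m unittest` executes every check in seconds, which is exactly the speed and objectivity advantage over manual testing.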

Partition data to optimize performance

Data volumes are increasing exponentially, which makes partitioning it all the more important if you’re at all interested in performance optimization (trust me, you are). Partitions can be constructed through automation to separate data, so only the data needed can more easily be accessed and managed. And it simplifies data archiving.
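A minimal sketch of the idea, assuming a hypothetical record layout, is horizontal partitioning by year: queries and archiving then touch only the partition they need.

```python
from collections import defaultdict

def partition_by_year(records):
    """Group records by the year prefix of their ISO date ("YYYY-MM-DD")."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec["date"][:4]].append(rec)
    return dict(partitions)

orders = [
    {"id": 1, "date": "2017-03-09"},
    {"id": 2, "date": "2018-11-13"},
    {"id": 3, "date": "2018-11-15"},
]
by_year = partition_by_year(orders)

# Archiving old data is now a matter of lifting out a whole partition,
# rather than scanning every record.
archived = by_year.pop("2017", [])
```

Real databases do this with declarative partition schemes, but the payoff is the same: only the needed slice of data is accessed and managed.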

Questions about application and data center modernization?

For more information about how your organization can develop or enhance its road to digital transformation, call on the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from all industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.
Mo Money, Mo Technology

By Richard Arneson

Facial recognition has come a long way since the 1960s, when a man named Woodrow Wilson Bledsoe (great name, right?) took measurements between key facial features, laid those coordinates out on a grid-lined tablet and, ultimately, could replicate subjects’ faces. His tablet—known as a RAND tablet—was actually a screen similar to an iPad, and Bledsoe would enter the measured lines on the tablet with a stylus, which emitted electromagnetic pulses. Those pulsed lines would enter a database maintaining hundreds of etchings. When Bledsoe’s system was “shown” a picture of a subject, it would rifle through the database and pick out—when it worked—the associated electronic sketch.

It was roughly twenty (20) years later when facial recognition, at least the bones of how it currently works, was invented. Called the Eigenface approach, it utilized linear algebra to automatically detect faces from images. Distances between facial features were automatically assigned coordinates, which were then submitted to a database. Once there, the magic would happen and soon cough up the individual in question.
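Conceptually, the Eigenface idea can be sketched in a few lines: treat each face image as a vector, compute the principal components (“eigenfaces”) of the training set, project every face into that low-dimensional space, and match a probe image to its nearest neighbor. The snippet below is a toy illustration in which random vectors stand in for face images and all dimensions are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
faces = rng.random((10, 16))   # 10 "training faces", 16 pixels each

mean_face = faces.mean(axis=0)
centered = faces - mean_face
# SVD of the centered data; rows of vt are the principal directions,
# i.e. the "eigenfaces".
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:5]            # keep the top 5 components

def project(img):
    """Coordinates of an image in eigenface space."""
    return eigenfaces @ (img - mean_face)

def identify(probe):
    """Index of the training face nearest the probe in eigenface space."""
    coords = np.array([project(f) for f in faces])
    dists = np.linalg.norm(coords - project(probe), axis=1)
    return int(np.argmin(dists))
```

In this toy setup, `identify(faces[3])` returns 3, since an exact copy projects onto its own coordinates; real systems add face detection, normalization and far more training data.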

For fighting crime, it’s a natural

Facial recognition was first used in crimefighting at the 2002 Super Bowl, where it helped authorities nab a few petty criminals. More significant, though, was the spate of false positives it reported. Imagine spending thousands of dollars on Super Bowl tickets, only to be ushered aside upon entry and told that a machine suspects you’re part of an international crime syndicate. Fighting crime with facial recognition was getting close, but it wasn’t quite there. There was considerable work to be done, and they did it—lots of it. And who could have predicted that sixteen (16) years and billions of dollars later it would be used to help keep a megastar safe.

There are a lot of creeps out there

Perennial pop chart topper Taylor Swift, with album sales of well over 35 million and a gazillion dollars in the bank, is probably one (1) of facial recognition technology’s biggest fans. It was recently disclosed by Rolling Stone that at her Rose Bowl concert last May, well-disguised facial recognition kiosks were set up to detect the very worst of her fans—stalkers.

The kiosks baited the creeps by displaying sundry information about Swift, including rehearsal clips, so anybody who decided to peruse them was scanned. Those images were sent to a command center, of sorts, located in Swift’s hometown of Nashville, Tennessee. From there, each scanned picture was cross-referenced against a considerable list of Swift stalkers (there are hundreds). There were several who weren’t scanned, however; they’re serving prison time for a number of Swift-stalking-related crimes.

Considering the scans weren’t turned over to authorities, it’s unclear for precisely what purpose the facial recognition technology was used. The whereabouts of the scans are unknown, and the Swift camp has remained mum; it’s clear they have no plans to disclose any details.

From a legality standpoint—and because concerts are considered private events and are subject to, among other security measures, surveillance—there’s no indication that Swift broke any privacy laws, even though concertgoers weren’t notified that they were being surveilled. Oh, but don’t worry—the ACLU is looking into it.

Security Concerns?

To find out how to secure your organization’s network, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

Read more about network security here:

Step aside all ye crimes—there’s a new king in town

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

How leasing can help guide your digital transformation journey

By Richard Arneson

Turn on your radio and you’ll probably hear about it within the first two (2) commercial breaks—Heavens to Betsy, why are you buying instead of leasing? Radio-wise, they’re probably talking about automobiles. But in many industries, decision makers are asking themselves that very same question. Information technology is no exception, but why would it be? The cost to stay relevant from a technological standpoint, especially if you’re digitally transforming your infrastructure to drive business initiatives and shape the future of your organization, takes a wad of cash. Leasing can take care of that.

Following are some of the benefits your company can enjoy if you’re trying to determine whether leasing network equipment is the way to go.

Relevance

In technology, what’s great today can be tomorrow’s afterthought. It changes so fast, and the need to stay technologically relevant is critical to the success of any organization. And with digital transformation, it’s no longer about simply providing high-speed Internet and connecting offices with an MPLS network. It’s about transforming data centers, hyperconvergence, composable infrastructures, storage solutions, cloud migrations, blending DevOps with cloud strategies, etc. The list is seemingly endless, and will certainly seem so if you’re trying to budget for any, or all, of the aforementioned technologies.

Leasing equipment can help ensure you’re implementing network assets that will keep your digital transformation journey progressing down the track. And the ability to utilize the latest and greatest technologies is, according to a 2005 study by the Equipment Leasing Association, the number one (1) reason companies turn to leasing. Concerns about technological obsolescence are taken off your shoulders and transferred to the lessor’s.

Budgeting

It’s probably a word that doesn’t bring a smile to your face, but setting budgets and adhering to them is the kind of stuff that keeps finance departments up at night. Leasing equipment provides a set, predictable expense you can count on. It moves spending from the CapEx to the OpEx bucket, and does so without upfront costs. And low, or no, upfront costs mean more cash flow.

In technology, purchasing equipment means you’re probably planning on utilizing it for a long time; otherwise you won’t get to enjoy the tax benefits that listing it as a depreciating asset provides. But, again, technology changes. It requires careful consideration, along with loads of dough, to pull the trigger on purchasing it.

Competitive Edge

Leasing allows businesses, especially small ones, to stay competitive in the marketplace. And that’s not to say it won’t provide the same for larger corporations. Keeping up with your competitors―and hopefully surpassing them―without the financial burden of purchasing equipment can be a significant short- and long-term boon for your business.

Upfront costs

As previously mentioned, leasing equipment doesn’t require deep pockets to foot the upfront costs required for ownership. In many cases, there are zero (0) upfront costs, especially in the IT industry (automobile leases don’t share the same philosophy).

This is not to say that equipment ownership doesn’t provide benefits, as well, but leasing it in today’s digital transformation world is a cash-friendly, predictable-payment way to ensure your organization doesn’t get left behind.

Want to learn more about how leasing can help pave your road to digital transformation? Talk to these experts

GDT Financial Services provides customers with full-service financial solutions for IT products, services and solutions. They’ve structured leasing arrangements that have met the needs of companies large and small, and from a variety of industries. For information about how they can help financially guide you on your digital transformation journey, contact them at financialservices@gdt.com.  They’d love to hear from you.

Our neighbors to the north know a thing or two about generating heat

By Richard Arneson

It would be impossible to argue that cryptocurrency is a green initiative, at least from an electricity perspective. It’s estimated that the power it takes to mine cryptocurrency worldwide could power Ireland—yes, that Ireland.

The amount of computing power required to mine cryptocurrency is staggering, so much so that Quebec, the Canadian province whose capital is Quebec City, is hiking rates by seventy-five percent (75%) for cryptocurrency companies. With the lowest per-kilowatt-hour rates in North America, Quebec has been the real estate of choice for crypto miners. And even though most of Quebec’s power grids are fed by green-friendly hydroelectricity due to the province’s abundance of lakes, streams, dams, etc., overages rely on good old-fashioned electricity―the eco-unfriendly kind. But with problems come solutions, and this one (1) is a doozy.

Mining for Heat

Heatmine is a Quebec startup that believes it has an answer to the energy drain that is crypto mining. It hasn’t figured out a way to reduce the computational power needed, but it is experimenting with another way to make use of all that number crunching—capturing the heat generated by the equipment.

If you’ve ever been in a data center, you understand the amount of heat computational equipment produces. And the frigid data center temps tell you another thing about equipment—it doesn’t like heat. So Heatmine has developed a solution to relieve crypto miners of that unwanted heat.

Their device looks like a Rube Goldberg concoction that combines metal cabinets, hot water heaters, graphics cards, PVC, copper pipes, and a lot of clunky junction points. It sits atop the equipment and pulls heat from it, much like belt-driven fans draw heat away from engines. The hot water heater sits inside the cabinet with dozens of graphics cards, all of which are connected to a maze of pipes. The equipment-generated heat gets converted into hot water, which is stored in, fittingly, the hot water heater. From there, it does what hot water heaters do—wait for hot water taps to open when hand washing, showers or radiant heat is needed.

Each unit, which is installed outside whatever area houses the equipment, comes equipped with 3G data connectivity, which allows for remote management. Heatmine claims that a single unit can heat up to three hundred (300) square meters of space, or generate 75,000 BTUs per hour (not bad, considering the space being heated is probably in Canada). Heatmine is so confident in its solution—albeit a rather unsightly one—that it’s footing the bill to install the units at crypto mining companies. It just wants the heat, and if it gets it, the crypto mining company gets its electricity for free.

Here’s the (potential) rub

As of yet, Heatmine hasn’t made it clear exactly how businesses that aren’t in close proximity to crypto mining companies can get access to all of that hot water. Yes, that’s a problem. But hot water aside, Heatmine’s website doesn’t explain how they’re going to provide customers with bargain basement electricity pricing. Apparently, for now they’re leaving a few of the details up to our imaginations.

Heatmine’s invention is fascinating, the technology is interesting, and its renewable-energy goals are admirable, but as of yet it’s hard to tell if their business model is half- or fully-baked.

Questions about your organization’s digital transformation journey?

For more information about how your organization can develop or enhance its road to digital transformation, call on the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from all industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

Is Blockchain a lie detector for the digital age?

If you fall victim to it, you won’t end up marking it as “like”

They were discovered on Google Play, but this is no game

Blockchain; it’s more than just Bitcoin

When being disruptive is a good thing

Step aside all ye crimes—there’s a new king in town

By Richard Arneson

It may not attract the attention of Hollywood like jewel heists or the mob, but neither can hold a candle to cybercrime―at least financially. According to a recent study by The Herjavec Group, headed by Canadian entrepreneur and Shark Tank regular Robert Herjavec, the cost of cybercrime will pass $6 trillion in the next three (3) years. Yes, $6 trillion. Write 6, then follow it with 12 zeroes—that’s a lot of cybercrime. And because that figure is double what it was just two (2) years ago, it stands to reason that $6 trillion will someday, perhaps as early as 2025, sound like a pittance.

No crime in the United States is growing at a faster rate―not even those associated with illicit drugs. The report estimates that $1 trillion will be spent fighting cybercrime over the next three (3) years. But perhaps the greatest threat to fighting cybercrime doesn’t have to do with money, but the lack of professionals who want to specialize in IT security.

It’s a bird…it’s a plane…no, it’s a cybercrime fighter

Several reports estimate that by 2021 there will be almost 4 million unfilled cybersecurity positions. One (1) reason for the shortage seems fairly intuitive given the precipitous rise in cybercrimes—companies can’t keep up with the demand. As Watergate informant Deep Throat advised reporters Woodward and Bernstein: “Follow the money.” Yes, cybercrime pays, and pays very well. And the more ill-gotten gains, the more miscreants enter the “profession”.

Part of the security job gap is a skills issue. Many companies don’t want to train IT personnel to become security experts, but want instead to hire somebody who brings the experience, expertise, certifications and accreditations with them. Combine that hiring philosophy with the super-high demand for security professionals and the numbers don’t add up. The trained, experienced IT security pros have already been snatched up. It’s not that hiring of IT security professionals isn’t on the rise; it’s that many companies stymie the hiring numbers by looking for experts who have no intention of leaving their current company. Remember, their expertise is in high demand, which means they’re also making a lot of dough.

Ransomware—cybercrime’s current business model

It’s predicted that a company is hit with a ransomware attack every fourteen (14) seconds (that interval drops to eleven (11) seconds by 2021). And the bad guys are getting better at it. Their shotgun, spray-and-pray approach has been supplanted by more targeted and effective infections. Ransomware is still the king of cybercrime. The FBI estimates that ransom payments hit $1 billion last year, and that total damages due to ransomware―including lost and destroyed data, lost productivity, theft of intellectual property, stolen customer, company and employee data, and network outages―exceeded $5 billion. It believes that figure will hit $20 billion within three (3) years.

Dollars, productivity and copped data aside, there’s another fear that’s been voiced by somebody who knows a thing or two (2) about crime. According to Frank Abagnale, whose shady, crime-fueled past was the subject of Steven Spielberg’s 2002 film Catch Me If You Can, “Up until now it’s just a financial crime for the purpose of stealing money―or stealing data that is money―but we have the ability now to turn someone’s pacemaker off.”

Security Concerns?

To find out how to secure your organization’s network, contact GDT’s tenured and talented engineers and security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

Read more about network security here:

Q & A for a Q & A website: Quora, what happened?

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Farmers may soon have a new, hard-working friend

By Richard Arneson

First off, check today’s date. You’ll notice that it’s not April 1st. Yes, what you are about to read is real—and nothing short of amazing. And, perhaps best of all, it’s an IoT story.

Researchers at the University of Washington have developed a backpack for (wait for it, wait for it) bees. While it’s unclear how they affix them (imagine trying to get their little arms through the straps), the backpacks are so light—.0035 ounces—that they allow the little critters to buzz around unfettered. They’ve yet to determine if the bees are burning more calories than normal, but it stands to reason they are. It just so happens that bees also weigh about .0035 ounces. But cardio aside, the best part is what the backpack carries—data!

In the event you’re unaware, farmers currently monitor crops with drones to, obviously, increase production and revenue. Thermal imaging from drone-captured video can quickly provide better views of the crop canopy, which tells them, among other things, which farming methods work the best. And while the bees aren’t capable of providing video—yet—they don’t require hours of battery recharging. The backpacks can gather up to seven (7) hours of data; find a drone that will stay in the air longer than thirty (30) minutes. Currently, the backpack sensors can only store about 30 kB of data, which limits them to collecting basic information related to light, temperature and humidity. But, as researchers do, they’re looking for ways to collect more, even live, data.

Nope, it’s not GPS

The researchers had to skirt the need to utilize GPS, which is a power hog. They got around this by scattering broadcasting antennas that, through triangulation, can detect the backpack’s position based on its signal strength. Collected data is then sent by reflecting radio waves from nearby antennas, a process known as backscatter.
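
For the curious, the position-fixing step described above (estimating a tag's location from its signal strength at several fixed antennas) can be sketched in a few lines of Python. Everything below is illustrative; the antenna layout, path-loss numbers and solver are assumptions for the sake of the example, not the University of Washington's actual system:

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-20.0, path_loss_exp=2.0):
    """Invert the log-distance path-loss model to estimate range in meters.
    (Illustrative constants; real RSSI readings are far noisier.)"""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(anchors, distances):
    """Least-squares position fit from three or more antenna/range pairs,
    using the standard linearization against the first anchor."""
    (x0, y0), d0 = anchors[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Solve the 2x2 normal equations directly (no numpy needed).
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Three hypothetical antennas around a field, and a tag at (10, 20) meters:
anchors = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0)]
true_pos = (10.0, 20.0)
dists = [math.dist(true_pos, a) for a in anchors]
print(trilaterate(anchors, dists))  # recovers (10.0, 20.0)
```

In a real deployment the ranges would come from `rssi_to_distance` applied to noisy backscatter readings, so many more antennas and a robust fit would be needed.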

Yes, there are control issues. Until they can be properly trained, there’s no telling where the bees will go (no, they’re not attempting to train them). The researchers are working on ways to collect data only when the bees are flying over certain areas. But the good news is that bees go back to their hives, where the backpacks will get recharged wirelessly (try finding a charging port on a crumb-sized backpack). As yet, there’s no information on how they’ll keep rogue bees from flying off with the backpacks, then selling them on eBay.

Internet of Things (IoT) questions?

GDT’s tenured, talented solutions architects, engineers and security analysts understand how to design and deploy IoT—including Smart City—solutions for enterprises, service providers, government agencies and cities of all sizes to help them enjoy more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

You can read more about IoT and Smart City Solutions below:

Why Smart Cities? It’s in the numbers

Five (5) things to consider prior to your company’s IoT journey

Without Application Performance Monitoring, your IoT goals may be MIA

How does IoT fit with SD-WAN?

GDT is leading the Smart Cities Revolution

In Information Technology, it’s the biggest thing going

By Richard Arneson

It’s big. It’s sweeping across college campuses. It’s the new rage. No, it’s not a band, a movie or some nonsensical fad, it’s…supply chain. While it may not be offered at liberal arts colleges, if a school offers a business degree you can count on supply chain classes being filled to the brim. It’s quickly pushing marketing, finance and management to the side and becoming THE business degree to earn. My son, who’s a senior in college, has five (5) roommates; three (3) of them are getting a degree in Supply Chain. Oh, and they all have very good jobs lined up when they graduate in five (5) months. Yes, supply chain is big, and rightly so.

Supply chain is simply how a company works with its suppliers to ensure products get to consumers quickly and efficiently. And, of course, it brings revenue to a company and its suppliers faster, so their money can be making them money sooner. While the term supply chain was coined approximately 35 years ago, the practice has been around as long as vendors have offered products made by somebody other than themselves. It chains together all individuals or companies that help bring a product to market, from the rawest of materials to the finished, consumer-ready product.

Supply Chain, meet Digital Transformation

Digital transformation may be to the IT world what supply chain is to the business world. Everybody wants digital transformation (some want it but don’t know exactly why), but both it and supply chain deliver, if done well, the same thing―a competitive advantage in the marketplace. In industrial environments, where extraordinary amounts of information and traffic need to reach a highly mobile workforce, it’s especially effective.

By digitizing warehouses and distribution facilities, costly, time-consuming manual errors can become a thing of the past. Siloed operations and processes mean poor system integration, which can result in losses of revenue and customers. IoT assets in the form of sensors and cameras can help monitor operations and protect against theft and loss, and cybersecurity measures can protect sensitive customer, supplier and company information.

By connecting warehousing equipment and systems, such as sensors, smart mobile devices, automated sorters and conveyors, and security systems, companies can stay on top of operations and warehouse systems. Also, they can access key data and information from a variety of sources to help develop solutions to address any issues that are brought to light. And maintenance of equipment can be monitored to ensure operations run smoothly and continue to do so in the future.

And once products leave the warehouse, shipments and fleets, such as delivery trucks, are more easily and accurately managed.

Now step from the warehouse and into the store. Yes, retailers have taken big hits over the past few years thanks to online shopping, but they, too, are enjoying the rewards of digital transformation. For retailers, those rewards are coming in the form of collected data, which, when utilized, can help them make insightful, customer-centric decisions. For example, is there a need to offer new or additional product lines? Are shoppers’ in-store needs being met? Retailers that are successfully enjoying digital transformation have become less about products and more about the customer.

Questions about your digital transformation journey?

For more information about how your organization can develop or enhance its road to digital transformation, call on the expert solutions architects and engineers at GDT. For years they’ve been helping customers of all sizes, and from all industries, realize their digital transformation goals by designing and deploying innovative, cutting-edge solutions that shape their organizations and help them realize positive business outcomes. Contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

You can read more about Digital Transformation and how it’s changing the business landscape here:

Workshops uncover insights into the state of IT and Digital Transformation

What is Digital Transformation?

The only thing we have to fear is…definitely not automation

Without application performance monitoring, your IoT goals may be MIA

When implementing a new technology, don’t forget this word

When being disruptive is a good thing

Automation and Autonomics—the difference is more than just a few letters

Is blockchain a lie detector for the digital age?

By Richard Arneson

We’re almost to the point when people no longer hear blockchain and think Bitcoin. Blockchain is now helping businesses and organizations tackle a variety of issues. For instance, Walmart has turned to blockchain to keep customers safe from E. coli-tainted produce (read about it here). The United Nations World Food Programme (UNWFP) recently provided cryptocurrency-based food vouchers to thousands of Syrian refugees. Healthcare organizations are beginning to utilize blockchain to better track patients after they’ve left the hospital. Several real estate companies have turned to blockchain to manage the cumbersome legal procedures and processes related to the sale or transfer of property. And blockchain is now addressing the age-old question that has haunted hiring managers for years—are job candidates being truthful?

Eight (8) out of ten (10)

A 2017 study conducted by HireRight, a background screening company, found that more than eight (8) out of ten (10) people lied on their resume (the figure is actually 85%, up from 66% just five (5) years ago). While the study doesn’t distinguish the magnitude of the lie (slight fib vs. claims of inventing the cure for polio), it’s still a shocking number, especially considering the current job market is the best in decades. It’s a candidate’s job market; why the need to lie?

The Gig Economy

The term gig economy is gaining steam. If you’re unaware of its 2018 meaning, it refers to the move toward hiring contracted workers for set periods of time. Recent studies predict that the percentage of contracted workers will reach anywhere from thirty to forty percent (30-40%) within the next couple of years. But the rush to hire contracted workers comes with some downside, at least for hiring managers―how can they vet candidates better and faster? With the gig economy, there are a lot more candidates to scrutinize.

The process of vetting, selecting and onboarding candidates is time-consuming and laborious. It’s already costly, but in the event a candidate slips through the vetting process with a lot of creative writing on their resume, the cost skyrockets. The entire process starts over, but there’s no guarantee it won’t happen again.

It’s ingenious

Blockchain’s encrypted public ledger structure is actually an ideal (and certainly creative) solution for the staffing industry. For instance, a blockchain could be used to verify a candidate’s work history, including their tenure, job title, achievements and supervisor. A candidate’s former employer can easily add to the public ledger as it relates to their company, and hiring managers and recruiters can simply check the ledger for verification purposes. The candidate would have to provide past and any future employers access to their wallet, within which the credentials are located. And if they refuse to provide access, you’ve now got a big, red flag waving in your face.

Once employment has been verified, it’s time-stamped and doesn’t require additional action. Future employers can easily check candidates’ work histories, which would greatly speed up the hiring process. No more waiting for days to get a call back for employment verification. And once employment, education, references, certifications and accreditations are verified, they can’t be altered by either the candidate or the verifier.
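
To make the idea concrete, here is a minimal, hypothetical sketch of such a ledger: each verified record is hashed together with the record before it, so any after-the-fact edit is immediately detectable. The field names and chaining scheme are illustrative only; a real deployment would add digital signatures, wallet-based access control and a consensus mechanism:

```python
import hashlib
import json

def _hash(entry: dict) -> str:
    """Deterministic SHA-256 digest of a record's contents."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class CredentialLedger:
    """Append-only, hash-chained list of verified employment records."""
    def __init__(self):
        self.chain = []

    def add_record(self, candidate, employer, title, tenure):
        entry = {
            "candidate": candidate,
            "employer": employer,
            "title": title,
            "tenure": tenure,
            "prev": self.chain[-1]["hash"] if self.chain else None,
        }
        entry["hash"] = _hash({k: v for k, v in entry.items() if k != "hash"})
        self.chain.append(entry)

    def verify(self) -> bool:
        """Recompute every hash; tampering with any past record fails."""
        prev = None
        for entry in self.chain:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev"] != prev or entry["hash"] != _hash(body):
                return False
            prev = entry["hash"]
        return True

ledger = CredentialLedger()
ledger.add_record("J. Doe", "Acme Corp", "Network Engineer", "2015-2018")
ledger.add_record("J. Doe", "Globex", "Sr. Network Engineer", "2018-2021")
print(ledger.verify())            # True: the chain is intact
ledger.chain[0]["title"] = "CTO"  # a candidate "improves" their history...
print(ledger.verify())            # False: the tampering is detectable
```

That immutability is exactly why the time-stamped, verify-once model works: neither the candidate nor the verifier can quietly rewrite what was recorded.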

But it doesn’t just benefit the hiring company

Candidates can provide access to whomever they’d like, and, with a cryptographic key, they can protect their background information with the level of privacy and security they’d like. And with blockchain, they can start earning a paycheck much faster.

Questions? Turn to the Experts

If you have questions about what to look for in an IT staffing solutions firm, contact the staffing professionals at Staffing@gdt.com. They’d love to hear from you. Some of the largest, most notable companies in the world have turned to GDT so key initiatives can be matched with the right IT professionals to drive projects to completion. GDT maintains a vast database of IT professionals who maintain the highest levels of certifications and accreditations in the industry. And they understand the importance of finding professionals with the right soft skills. In addition, the IT professionals they place have access to the talented, tenured solutions architects, engineers and professionals at GDT.

To learn more about IT Staffing, read the following…

Utilizing an IT Staffing Solutions firm boils down to savings, whether in dollars or time

IT Staff Augmentation–it’s about more than just the resume

Do you need Staff Aug or Outsourcing—or both?

How Companies are benefiting from IT Staff Augmentation

And read about how GDT’s Staffing Solutions team helped a financial services firm find the perfect candidate in record time

Why Smart Cities? It’s in the numbers

By Richard Arneson

We’re living in a world where some of the most mundane of tasks can be offloaded thanks to artificial intelligence (AI), machine learning and predictive analytics. Whether for businesses or individuals, time-consuming tasks and processes can grind down productivity and prevent individuals, teams and departments from working on initiatives that should be shaping their company’s future for years to come. And our personal lives are benefiting tremendously from the Smart Revolution. Healthcare apps are freeing up time that patients would have otherwise spent sitting in traffic or in a waiting room flipping through a 5-year-old copy of Good Housekeeping. You can monitor your home’s temperature, lighting and security while sipping a margarita on a Cancun beach. Yes, the smart life is simpler, less stressful and, perhaps best of all, more fiscally minded. And on the off chance you haven’t heard, cities both big and small are beginning to see the light.

A Smart approach to combat budget cuts

Try to find a city that hasn’t experienced budget cuts in the past twenty (20) years. It’s rare, if one even exists. Police forces and fire departments have become understaffed, bone-jarring potholes remain unrepaired, parks are left untended and unsafe, youth sports and arts districts go unsupported; the list goes on and on. The further down the list you read, the more disheartening it becomes. And what’s the most common way cities try to combat budget cuts? Higher taxes. Yay!

But some cities, at least the smarter ones, are turning to a smarter solution and transforming themselves into a Smart City. The following stats and figures provide a brief list of some of the many reasons they’re doing so:

Lighting alone composes up to forty percent (40%) of cities’ utility bills

By utilizing sensors and a scalable platform to smartly address street lighting, both utility costs and those related to crime can be greatly reduced. Without it, energy is wasted and cities’ carbon footprint will continue to grow. IoT-enabled lampposts allow lights to be dimmed or turned off in the absence of nearby traffic, whether foot or vehicular.
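
As a toy illustration of that dimming logic, the decision a connected lamppost makes can be boiled down to a few rules. The thresholds and brightness levels below are invented for the example, not drawn from any real deployment:

```python
# Hypothetical sketch of an IoT-enabled lamppost's dimming decision:
# off in daylight, full brightness when a sensor reports nearby foot or
# vehicular traffic, and a low "presence" level on an empty street.

def lamp_brightness(ambient_lux: float, seconds_since_motion: float,
                    daylight_threshold: float = 50.0,
                    hold_seconds: float = 120.0) -> int:
    """Return brightness as a percentage (0-100)."""
    if ambient_lux >= daylight_threshold:
        return 0      # daylight: light off, energy saved
    if seconds_since_motion <= hold_seconds:
        return 100    # recent traffic nearby: full brightness
    return 20         # empty street: dimmed, but not dark

print(lamp_brightness(ambient_lux=200.0, seconds_since_motion=5.0))   # 0
print(lamp_brightness(ambient_lux=2.0, seconds_since_motion=30.0))    # 100
print(lamp_brightness(ambient_lux=2.0, seconds_since_motion=600.0))   # 20
```

Multiplied across thousands of lampposts reporting to a central platform, even these simple rules are where the utility-bill savings come from.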

Thirty percent (30%) of traffic congestion is due to drivers looking for a parking space

Smart Cities are better and more rapidly directing drivers to areas where parking is available. As a result, traffic congestion and emissions are reduced, and time and cost savings―and far happier citizens―are the result. And don’t forget what IoT-based ride sharing solutions, along with bicycle and scooter rentals, are doing to reduce traffic snarls and emissions.

Air pollution costs municipalities $1.7 trillion per year

With location-based and real-time monitoring of air quality, Smart Cities can quickly determine which parts of town have the highest emissions levels. This level of information allows them to determine what is causing the disparities in air quality, and develop measures to fix them. In addition, citizens will be equipped with better, more timely information about when to venture outdoors or remain inside.

Traffic congestion costs drivers $300 billion each year

With data gathered from connected vehicles, city workers can be more quickly deployed to address congestion caused by everything from abandoned vehicles to road debris and other safety hazards.

Crime, and indirect costs related to it, total $3.2 billion annually

With sensors and connected first responders, Smart Cities can more efficiently monitor and respond to crime-related incidents. And retail districts, among other areas, soon reap the rewards, as consumers feel more secure to frequent local businesses and stay for longer periods of time.

Cities report a sixty percent (60%) inefficiency rate regarding trash collection

Smart Cities can better direct trash collection personnel to areas and collection receptacles that require immediate attention. This level of information ensures collections are conducted more efficiently, which not only saves time and money, but improves air quality.

The Smart City Revolution has already saved thousands of cities across the globe money and jobs, greatly reduced budget cuts, and helped keep them safer, cleaner and healthier. Becoming a Smart City is certainly a fiscally and societally responsible goal, but getting it beyond the dream stage requires a high level of insight, empirical experience and planning. “It’s important to take a holistic view of the city’s entire financial and infrastructural landscape prior to attempting a Smart City migration,” said Allen Sulgrove, Director of GDT’s Smart City and IoT Solutions practice. “It’s only when data and technologies are tightly integrated to address particular needs that municipalities will enjoy the full value, from a societal and economic impact, of becoming a Smart City.”

Before acting on your dreams to become a Smart City, consult with experts who’ve done it

Becoming a Smart City only becomes beneficial to municipalities if its deployment results in economies of scale, cost efficiencies, optimization of resources, better customer service and satisfaction, and, ultimately, higher revenue. And that’s why consulting with Smart City and IoT professionals like those at GDT is critically important. GDT’s tenured, talented solutions architects, engineers and security analysts understand how to design and deploy Smart City solutions for cities of all sizes to help them realize more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

You can read more about Smart Cities and IoT Solutions below:

Five (5) things to consider prior to your company’s IoT journey

Without Application Performance Monitoring, your IoT goals may be MIA

How does IoT fit with SD-WAN?

GDT is leading the Smart Cities Revolution

Don’t sell fiber optics short―what it can also deliver is pretty amazing

By Richard Arneson

Forget for one (1) second that it’s super-fast (only slightly slower than the speed of light), has far less attenuation (signal loss) than copper or coax cable, is impervious to EMI and RFI, doesn’t pose a fire hazard and doesn’t require replacement nearly as often as its transmission counterparts. Fiber optics can also be used to detect earthquakes. Yes, earthquakes.

Researchers at a Department of Energy (DoE) Office of Science laboratory managed by the University of California in Berkeley, CA, have spent years amassing insane amounts of seismic data collected from dark fiber buried in and around the Bay Area. They analyze their more than 500 terabytes of collected data to help monitor landslides, sinkholes, changes in injected carbon dioxide and, most importantly, earthquakes. By deploying what they term “distributed acoustic sensing”, the buried, unused fiber helps them measure seismic waves and achieve results comparable to traditional seismometers. It’s a game-changer in the world of seismology.

Out with the old…well, not just yet

Seismometers are currently used to detect the tiniest of tremors, both above ground and below it. But one (1) seismometer can’t detect movement over expansive areas. And it takes at least three (3) to locate the epicenter of a single earthquake. Think about the land mass in California alone―over a hundred million acres. That’s a lot of seismometers. While they can be linked together to expand their range of detection, they’re difficult to manage and maintain. And they’re the diamond of technologies: very small, but extremely pricey.

The dark fiber that DoE researchers monitor transforms vibrations into data they collect and analyze. Normal, ambient vibrations caused by traffic, construction, etc., must first be parsed out and jettisoned; only then can researchers get down to the business of identifying vibrations related to earthquakes, sinkholes and landslides.

The idea to utilize fiber optics to measure Earth’s tremors actually came from the oil and gas industry. After years of feeling trucks’ movements under their Tony Lamas, petroleum engineers wondered if there was a way to measure movements located underground, such as those related to oil wells and pipelines. They opined that fiber optics might be the answer. They were right. It worked, and worked well. It’s still being used today for those purposes.

Here’s how they do it

Yes, it’s amazing, but it’s actually quite simple. When light is shot down the dark fiber, any impurities in the fiber scatter the light. Researchers use a laser interferometer to measure this scattering, which is caused by the pushing, pulling, pinching and squeezing of the fiber. And that manipulation is the work of vibrations and tremors, even the tiniest of them.
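
One piece of the puzzle is easy to show in code: because the pulse’s speed inside the fiber is known, the arrival time of each backscattered return maps directly to a position along the cable. The sketch below is a simplified model for illustration (the group index value is typical for silica fiber), not the researchers’ actual processing pipeline:

```python
# Simplified model of how distributed acoustic sensing locates a
# scattering point: the laser pulse travels out and the backscattered
# light travels back, so distance = (speed in fiber * round trip) / 2.

C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
GROUP_INDEX = 1.468        # approximate group index of silica fiber

def scatter_position(round_trip_seconds: float) -> float:
    """Distance in meters along the fiber to the scattering point."""
    v = C_VACUUM / GROUP_INDEX      # pulse speed inside the fiber
    return v * round_trip_seconds / 2.0

# A return arriving 100 microseconds after launch scattered roughly
# 10 km down the fiber:
print(round(scatter_position(100e-6)))
```

Repeating that mapping for every return in a pulse turns one strand of dark fiber into, in effect, thousands of evenly spaced sensors.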

With the glut of dark fiber available thanks to the telecom industry’s 1990s arms race to get as much of it buried, strung or dropped in the ocean as possible, there’s no shortage of it for seismologists to access.

For questions, turn to these optical networking experts

If you have questions or would like more information about fiber optics or optical networking, contact GDT’s Optical Networking practice professionals at Optical@gdt.com. Composed of experienced optical engineers, solutions architects and project managers who specialize in optical networks, the GDT Optical Networking team supports some of the largest service providers and enterprises in the world. They’d love to hear from you.

For additional reading about fiber optics, check these out:

A fiber optic first

When good fiber goes bad

Busting myths about fiber optics

Just when you thought it couldn’t get any better

And see how GDT’s talented and tenured Optical Networking team provided an optical solution, service and support for one (1) of the world’s largest social media companies:

When a redundancy plan needs…redundancy

They know they want it, but don’t know how to get it…or really understand why they need it

By Richard Arneson

According to a recent study by a London-based research and analytics firm, ninety-one percent (91%) of executives realize the importance of AI and machine learning technologies. However, only slightly more than half of them are currently utilizing it. And less than twenty percent (20%) know how it’s being used in their organization, which makes you wonder if that “slightly over half” figure is really accurate. They know they want it and, as you’ll read below, know how they’d like to use it, but that’s often where the dream dies.

Here’s the problem

At the core of executives’ issues with implementing AI and machine learning is the fact that they don’t really know how to communicate the technologies’ benefits. They can’t speak to how or why they’re being, or need to be, used.

These percentages are growing rapidly, but the key decision makers who are in charge of making AI and machine learning happen throughout their organizations are in the same boat. They don’t have enough information or skill sets on board to help match needs and/or adoption levels with AI and machine learning.

It’s one (1) thing to understand that changes need to be made, but another to know precisely where the change is needed. And without fully understanding what AI and machine learning can accomplish, it’s difficult to know what issues it can address. They know adoption is needed to stay competitive in the marketplace, but forking over a lot of dough without knowledge of the technologies and what they can provide is throwing good money after bad…executives don’t like that.

What else they know

The study also uncovered how executives would like to utilize AI and machine learning. Well over fifty percent (50%) want it to enhance employee productivity, which doesn’t come as a surprise. Find an executive who doesn’t want to improve employee output, and you’re looking at an organization that isn’t long for this world. In addition, they want AI and machine learning to help them make better business decisions and streamline business processes.

Of those executives whose organizations are currently utilizing AI and machine learning, its ability to automate decisions ranks first, as forty percent (40%) utilize it for exactly that purpose. Second was customer satisfaction and retention at thirty-six percent (36%), with deploying a better way to detect waste and fraud ranking third at thirty-three percent (33%).

Trust

Another study, this one (1) from Deloitte based on interviews with executives regarding AI and machine learning, discovered that the notion of putting blind faith in results is terrifying. The executives know that data, and loads of it, is being analyzed, but haven’t the foggiest what’s done with it and, most importantly, whether what comes out the other side is trustworthy.

And part of that fear lies in the word algorithm. When it’s thrown about, it tends to make people squirm. Whether referring to math, SEO or AI and machine learning, many tend to tense up when they hear it, or stop listening altogether. Algorithm may sound cryptic, but it shouldn’t; it’s simply a mathematical equation. It’s the task(s) algorithms perform, and how they calculate answers and results, that can give people a case of tired head. They think back to that C they got in high school algebra and promptly decide that they’re not equipped to participate in any discussion that includes the word or has anything to do with it.

The aforementioned figures will steadily rise, of course, but how rapidly depends on how well executives understand what AI and machine learning can accomplish, and how to explain the benefits to others, like shareholders.

Education breeds Trust…here’s how to get both

GDT’s tenured, talented solutions architects, engineers and security analysts understand how to positively incorporate change by designing and deploying innovative solutions and technologies―including AI and machine learning―that help customers realize greater productivity, enhanced operations and more revenue. GDT utilizes key partnerships with best-of-breed technology companies to help organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

To read more about how AI and machine learning are helping organizations (and can help yours) transform business, click any of the following:

Answer: you get a better solution than your current one

AN AI answer from a VIP provider

Unscrambled, these letters represent some of the hottest topics in the IT industry

Automation and Autonomics: the difference between them is more than a few letters

The only thing we have to fear is DEFINITELY not automation

When being disruptive is a good thing

If you fall victim to it, you won’t end up marking it as “like”

By Richard Arneson

Apparently, scammers get bored, too, at least the ones who find it fun and profitable to generate hustles related to cryptocurrency. They’ve found a new target—Facebook. Their scamming medium of choice has primarily been Twitter, which has for months been littered with fake cryptocurrency advertisements. For Facebook, however, they’ve modified their strategy and tactics. On Twitter, their basic, garden-variety scam has been the infamous Bitcoin giveaway (tip: if it’s a giveaway, it’s you who will be giving away something). For Facebook, their tactic involves luring users into coughing up sensitive info, such as the holy grail of scamming―credit card information.

Here’s how it works…on Facebook, at least

The attackers (I call them miscreants) set up phony pages with a call-to-action in the form of a fake, sponsored ad. After clicking on it, users are directed to a replica CNBC page that promotes an investment opportunity. Claims of big investment opportunities should be the first clue that you’ve ventured into murky digital waters, but if that clue goes unheeded, you’ll be given the opportunity to purchase a new, shiny cryptocurrency from CashlessPay.

According to the ad, Singapore just announced it’s adopting an official coin, which can only be purchased from CashlessPay. Oh, and it includes fake endorsements from sundry celebrities, including Sir Richard Branson, famed English entrepreneur and owner of The Virgin Group. Gee, if Richard Branson invested in it, it must be good. I’m all in! And that’s exactly what they’re praying to The God of Cybercrime that you’ll be thinking. And, of course, once you pull out your plastic cash and enter in a few digits, you’ve just become a victim. Your card will soon be used, without your knowledge, to purchase high-end electronics throughout the world.

Is Facebook asleep at the wheel?

It seems odd to most that these malicious ads got past Facebook and Twitter in the first place. In Facebook’s case, the miscreants were able to slide past its defense mechanisms, which is odd considering that Facebook banned all blockchain and cryptocurrency advertisements earlier this year. It’s not clear exactly how they circumvented Facebook’s security sentinels, but obviously they did. It is interesting, though, that these phony cryptocurrencies require payment via bank wires or credit cards.

Twitter appears to be the first social media victim, but they’re not flattered

While Facebook has been scammed for what appears to be only a matter of weeks, Twitter has been battling fake cryptocurrency ads for the past nine (9) months. Initially, Twitter scammers launched armies of bots that mass-spammed links to cryptocurrency giveaways. They then tweaked their approach and implemented a more selective spamming model: hijacking real profiles; one (1) of their favorites was Elon Musk. Other targets soon followed, including several politicians and government accounts. Their pièce de résistance? Google and Target, both of which fell victim to the scam.

The question now: “Can Facebook remediate this issue faster than Twitter?”

We’ll see.

Security Concerns?

To find out how to secure your organization’s network, contact GDT’s tenured and talented security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

 

Read more about network security here:

They were discovered on Google Play, but this is no game

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

When customers are inspired to share their experiences, these types of awards carry far more weight

By Richard Arneson

There’s a reason so many of today’s awards are attached to the word “Choice”. Let’s be honest, they mean a little more; they carry more weight. And Choice awards address what we’ve all suspected at one time or another—awards based on judging by a panel of “experts” are fraught with problems. Here’s an example, but keep in mind that I’ve conducted absolutely zero (0) research on the subject―in 1977, Star Wars DID NOT win the Academy Award for Best Picture. It made over a gazillion dollars; people in other solar systems lined up to see it. Not to belittle the winner, but…Annie Hall? Really? I think this gross oversight is the very reason “Choice” awards were invented in the first place. “Enough,” said moviegoers, “we’ll start selecting winners.”

The 2018 Gartner Peer Insights Customers’ Choice Awards

There are awards for the people’s choice, the kids’ choice, the editor’s choice, and on and on. In the business world, though, there’s nothing better than winning a customers’ choice award. It suggests that you’re doing “it” well, whatever “it” refers to. No question, it’s great to be honored by industry experts, trade publications or professional associations. But when customers have stood up to proclaim that what you offer, deliver and manage is exactly what they’d contracted for, it carries more weight. No question.

Now combine a customers’ choice award with Gartner, one (1) of the IT industry’s most influential and trusted research firms, and you’ve exponentially added cachet to the mix—that’s no secret to Juniper Networks. They were recently named one of only three (3) recipients of a 2018 Gartner Peer Insights Customers’ Choice Award for Data Center Networking (there were dozens that didn’t make the cut).

Gartner’s Customers’ Choice Awards are presented to vendors that received at least fifty published customer reviews in the prior twelve months, with an average rating of at least 4.2 stars. Juniper Networks received more than twice the required reviews and comfortably exceeded the 4.2-star threshold. But before you head down the conspiracy theory rabbit hole and suspect fictitious reviews were uploaded, consider who is presenting the awards—Gartner. They understand how to ensure evaluations are fair and untainted. Remember, they’re one of the most noteworthy and quoted IT research firms on the planet.
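As a back-of-the-envelope illustration, the eligibility bar boils down to a window filter and an average. The numbers (50 reviews, 12 months, 4.2 stars) come from Gartner’s published criteria; the function and data shapes below are my own sketch, not Gartner’s actual methodology:

```python
from datetime import date, timedelta

def customers_choice_eligible(reviews, today=date(2018, 11, 1)):
    """Check the stated bar for a Customers' Choice award: at least
    50 published reviews in the trailing 12 months, averaging at
    least 4.2 stars.

    `reviews` is a list of (review_date, star_rating) pairs -- an
    illustrative data shape, not Gartner's.
    """
    window_start = today - timedelta(days=365)
    recent = [stars for when, stars in reviews if when >= window_start]
    if len(recent) < 50:
        return False
    return sum(recent) / len(recent) >= 4.2
```

By this yardstick, Juniper cleared both bars with room to spare: more than a hundred reviews, well above the 4.2-star average.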

What makes Juniper Networks so special in the world of data center networking?

Because it’s a customers’ choice award, I’ll let several of Juniper’s customers take this one. The following are actual quotes that Gartner used in determining award winners (the reviewers’ titles range from IT administrator to CIO, and everything in between):

“Juniper EX switches are tanks that just work. Junos is a dream to work with.”

“Juniper allows smarter networks for less money.”

“Technology and, more importantly, the people at Juniper have proven to be second to none.”

“Juniper robust, wire speed and flexible technology.”

“Implementation has been easy, and the Juniper switches integrate well into our existing network.”

“Implementation ease, operations, and scalability are top-notch with Juniper.”

You get the idea. There are several dozen just like them that were crafted by technical professionals who work for companies of different sizes and from a wide array of industries. But they all discovered the same thing: selecting Juniper Networks was another type of choice—the perfect one.

Have questions about how your organization can enjoy the same features and benefits as these Juniper Networks customers? These experts have the answers.

GDT’s tenured, talented solutions architects, engineers and security analysts understand how to incorporate data center solutions that help customers realize greater productivity, enhanced operations and more revenue. GDT leverages key partnerships with best-of-breed technology companies, like Juniper Networks, to help organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

 

You can get more info below regarding the wealth of technologies, products, services and issues dotting the IT landscape:

Unwrapping DevOps

Autonomics and Automation–is there a difference?

Answer: you get a solution better than your current one

A-M-D-I-L-L: Unscrambled, these letters represent some of the hottest topics in the IT Industry

A Robust Solution for the entry-level storage customer

Don’t put off ’til tomorrow what you MUST do today

Want to read about a cool, real-world Blockchain application?

When being disruptive is a good thing

Rx for IT departments—a security check-up

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Utilizing an IT Staffing Solutions firm boils down to savings, whether in dollars or time

By Richard Arneson

Raise your hand if you don’t work for an IT staffing agency. If your hand is still in the air, ask yourself this question ― “Do I feel like I’m in the IT staffing business?” You may feel that way if you’ve ever been tasked with finding technical and engineering talent, especially to complete a project or tackle an initiative that requires a carefully defined technical skill set and a high, and very particular, level of experience. And if you need to augment your IT staff for a set period, the urgency gets ratcheted up a notch or two. Staff augmentation is synonymous with “higher level of urgency.” Nobody wants the right candidate for a project in a few months. Projects are timely. If they’re not, they’re not really projects. They’re things you’ll eventually get around to. No need to augment your IT staff for those.

With the range of technologies, technical solutions and certifications that exist today, finding the perfect engineering professional with the right experience is like trying to find a left-handed pitcher who has a mid-nineties fastball, throws a wicked curve and can drop in a slider that leaves batters flailing at any pitch within five feet of the strike zone. But it’s about more than finding the right certifications, education and experience. It’s also about finding the right soft skills, such as problem solving, communication, work ethic, and time and project management. And will they fit into the department and the company from a personality, philosophical, even sense-of-humor standpoint?

Cost Savings

Several studies have estimated that it costs approximately 150% of an employee’s annual salary to fill a position that pays over $100,000 a year (about 20% for positions under $30,000, and over 250% for executive-level positions).

Let’s say you’re looking for a Cisco Certified Network Professional (CCNP). They make, on average, approximately $110,000 per year, so expect to spend roughly $165,000 to hire one. Yes, this percentage and the related costs sound staggering, even unbelievable, but when you dig into the numbers, it starts to make sense. Consider these costs―man-hours spent by internal employees and resources, advertising and job board costs, onboarding and training expenses, and fees related to software licensing, employee referrals and memberships. But don’t stop there…

Now consider the costs incurred from not having a professional working on a project or initiative. For instance, is not having them on staff preventing others from designing or deploying a new solution that could be saving costs and time, and/or generating revenue? The numbers add up fast, and they’re exorbitant if you need somebody for a set length of time and know they won’t be on staff for the next five to ten years. When spread out over years, the costs to find that CCNP are palatable. If they’re needed for less than a year, not so much.
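To make the arithmetic concrete, here’s a small sketch of the cost model described above. The 20% and 150% multipliers come from the studies cited; the tier boundaries and the linear ramp for mid-range salaries are assumptions of mine, added purely for illustration:

```python
def cost_to_fill(annual_salary):
    """Estimated cost to fill a position, using the multipliers the
    studies above cite: ~20% of salary under $30K, ~150% above $100K
    (executive searches can run 250%+, not modeled here)."""
    if annual_salary < 30_000:
        multiplier = 0.20
    elif annual_salary > 100_000:
        multiplier = 1.50
    else:
        # Assumed: interpolate between the 20% and 150% data points.
        multiplier = 0.20 + 1.30 * (annual_salary - 30_000) / 70_000
    return annual_salary * multiplier

def vacancy_cost(annual_salary, months_open):
    """Salary-denominated cost of an empty (or wrongly filled) seat."""
    return annual_salary / 12 * months_open

# A $110K CCNP costs ~$165K to hire at the 150% multiplier, and two
# months with the seat empty or mis-filled burns roughly another $18K.
print(cost_to_fill(110_000))            # 165000.0
print(round(vacancy_cost(110_000, 2)))  # 18333
```

The second function is the same math behind the “$18,000 for the wrong CCNP” figure that comes up when a bad hire takes two months to surface.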

Time Savings

As the saying goes, “time is money.” It’s trite, yes, but timeworn for good reason. In the business world, time is calculated in dollars. If members of your HR or recruiting team are spending hours (often months) trying to find the perfect technical professional, your company is burning through funds. You may not see invoices cross your desk or checks being cut, but they’re costs, just the same.

Often, HR and recruiting professionals don’t put the same level of focus on bringing in a professional they know won’t be a member of the organization for years to come. But that approach is a pay-me-now-or-pay-me-later proposition. If the right person isn’t placed, understand this—you’ll be repeating the entire recruiting process very soon. Trying to shoehorn somebody into a position they’re not qualified or right for will come out in the wash. And if it takes you two months to figure this out, you’ve just spent over $18,000 on the wrong CCNP. While it’s money you won’t get back, you’ll be reminded of it often—especially come budget time.

Do they understand The Language?

If your internal recruiters spend most of their time placing professionals outside of IT, there’s a greater-than-zero chance they don’t understand technology-speak. Yes, it’s a whole other language, and here’s the rub―some people speak it but don’t understand it. Does your team know how to detect IT illiteracy? If not, they’ll probably get snowed, which will lead to poor candidate placements, additional unbudgeted costs, frustrated looks from executives, and ibuprofen runs for recurring staffing headaches. Not being able to detect misrepresentations in candidates’ abilities and/or experience is a budget killer. Your staff needs to know which questions to ask, how to ask them, and how to discern when they’re being sold a bill of goods.

Call on the Experts

If you have questions about what to look for in an IT staffing solutions firm, contact the staffing professionals at Staffing@gdt.com. They’d love to hear from you.

Some of the largest, most notable companies in the world have turned to GDT so key initiatives can be matched with the right IT professionals to drive projects to completion. GDT maintains a vast database of IT professionals who maintain the highest levels of certifications and accreditations in the industry. And they understand the importance of finding professionals with the right soft skills. In addition, the IT professionals they place have access to the talented, tenured solutions architects, engineers and professionals at GDT.

To learn more about IT Staffing, read the following…

IT Staff Augmentation–it’s about more than just the resume

Do you need Staff Aug or Outsourcing—or both?

How Companies are benefiting from IT Staff Augmentation

CASE STUDY—GDT Staffing Services delivered the right professional―fast

They were discovered on Google Play, but this is no game

By Richard Arneson

It’s been over three years since Google announced that developers could no longer publish applications on Google Play willy-nilly—that is, without their apps first having been vetted. But that vetting process is largely handled the way it is on Apple’s App Store—manually. Yes, people are the main line of malware and app-violation detective work, and with almost 3 million apps on Google Play, there’s plenty of room for things to slip through. When people are involved, mistakes are made. That was made evident this past Tuesday (Nov. 13th), when Lukas Stefanko, a malware researcher from Slovakia, published his findings: four apps on Google Play designed to dupe users into inadvertently coughing up their cryptocurrency.

“The Crypto 4”

Stefanko discovered an app that appeared to be developed and offered by Ethereum, the legitimate cryptocurrency. The app was downloaded only a few hundred times, likely due to its $388 price tag, but multiply that price a few hundred times over and the malicious developers did all right for themselves.

Stefanko also discovered three apps that impersonated legitimate cryptocurrency services NEO, Tether and MetaMask.

Cryptocurrency wallets generate a public address and a private key for the user. In the case of the fake NEO and Tether apps, however, the user was unknowingly shown the attacker’s public address. Once the app launched, the user believed that address had been generated for them and deposited funds to it. But only the attacker held the matching private key, so the attacker could withdraw those funds—and the user, lacking the key, could not. It was discovered that the fraudulent NEO and Tether apps used the same malicious public address.
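A toy sketch of why the hardcoded address matters. This uses a plain hash as a stand-in for the elliptic-curve cryptography real wallets use; nothing here is NEO’s or Tether’s actual code, but the one-way relationship between key and address is the same:

```python
import hashlib
import secrets

def new_wallet():
    # Toy wallet: a random 256-bit private key, and an "address"
    # derived one-way from it. Real chains derive addresses from
    # elliptic-curve public keys, but the principle holds: the
    # address reveals nothing about the key.
    private_key = secrets.token_hex(32)
    address = hashlib.sha256(private_key.encode()).hexdigest()[:40]
    return private_key, address

def can_spend(private_key, address):
    # Funds at an address can be moved only by whoever holds the
    # matching private key.
    return hashlib.sha256(private_key.encode()).hexdigest()[:40] == address

attacker_key, attacker_address = new_wallet()
victim_key, victim_address = new_wallet()

# The fake app shows the victim the attacker's address; any deposit
# the victim makes there is spendable only by the attacker.
assert can_spend(attacker_key, attacker_address)
assert not can_spend(victim_key, attacker_address)
```

That asymmetry is the whole scam: the victim sees a valid-looking address and has no way to tell, from the address alone, that someone else holds its key.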

The MetaMask scam phished for users’ wallet passwords and private keys, asking them to provide both. And if a user believed they had accessed MetaMask—the real MetaMask—it’s quite possible they lost some of their treasured crypto.

Stefanko reported all four scams to Google Security, and the apps were promptly removed from Google Play.

What is Google doing to prevent this?

They already have…sort of. On July 27th, Google followed Apple’s lead and banned crypto-mining apps from Google Play (Apple banned them a month earlier, in June). Google gave developers a 30-day grace period to revise their apps to comply with the new ban. But as recently as last week, it was discovered (not by Stefanko, in this case) that eight crypto-mining apps were still available on Google Play. Google has reported that three of those apps have been removed, but the following apparently still exist: Crypto miner PRO, Pickaxe Miner and Pocket Miner. Another, Bitcoin Miner, is still carried on Google Play, but is reportedly in compliance with Google’s revised terms.

But before you label Google grossly negligent, it’s important to note that last year they jettisoned over five hundred apps that could have easily installed spyware on users’ devices. They’re not sitting idly by. These 500 apps had been downloaded over 100 million times. Thankfully, Igexin, the developer of the advertising SDK those apps used, wasn’t operating maliciously; it had accidentally created a backdoor security vulnerability. But had they been so inclined, they could have infected millions of devices via malicious plugins.

They’re not banning everything

Google doesn’t have anything against cryptocurrency, just the mining of it on devices that download apps from Google Play. Apps from cryptocurrency exchanges remain available.

Security Concerns?

To find out more about the many threats that may target your organization, contact GDT’s tenured and talented security analysts at SOC@GDT.com. From their Security and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those of some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

 

Read more about network security here:

And in this corner…

Elections are in, but there’s one (1) tally that remains to be counted

Hiring A Hacker Probably Shouldn’t Be Part Of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

A boy and his computer…and a $67 billion purchase

By Richard Arneson

Most people know the story. Everybody who hears it likes it. It’s the one about a University of Texas pre-med student who decided to start a computer company from his dorm room. He’d take a garden-variety computer and, metaphorically speaking, put on high-end headers, a flathead engine, a four-speed overdrive transmission and chrome mags. Yes, Michael Dell souped up computers, re-sold them (a lot of them) and dropped his medical school plans. Now, thirty-five years later, Dell has over 140,000 employees globally and is certainly a juggernaut in the high-tech industry.

Once known strictly as a PC vendor, Dell has—understatement alert!—branched out just a bit. That’s not to say they’ve let their PC focus fall by the wayside; in fact, they’re the third-largest PC vendor—behind Lenovo and HP—with a market cap of over $70 billion. Now, of course, they’re known for much more, thanks in part to their 2009 acquisition of Perot Systems, which helped make them an immediate player in enterprise IT services and data center solutions. Then, along with PCs, storage and networking solutions, they added printers, servers, switches, cameras and HDTVs, to name a few, to their sales arsenal. And then came 2016.

If you thought the Perot Systems purchase was a biggie…

You’d have a hard time finding anybody, even the Wall Street types, who thinks Dell’s $67 billion purchase of EMC Corporation in 2016 was anything but a home run. And that’s rare—highly rare—when you consider that most technology purchases and mergers have been labeled everything from “meh” to “disastrous,” and every negative adjective in between. Sure, Dell’s intentions were met with some naysayers at the time, but finding critics today would be tough.

After the EMC purchase, Dell was reorganized into Dell Technologies, and its multiple divisions were consolidated into three subsidiaries: Dell Client Solutions Group (consumers), Dell EMC (data management hardware and software) and VMware, of which Dell became majority owner through the EMC purchase (Dell currently holds an 80% stake in the Palo Alto, CA-based software virtualization company). A year after the acquisition, Dell EMC announced the formation of an IoT division, run by VMware CTO Ray O’Farrell, who revealed in August that they’ll be pumping over $1 billion into IoT research and development over the next three years.

The Dell EMC Strategic Focus (in addition to IoT, of course)

Dell EMC’s Ready Solutions for AI

In August, Dell EMC introduced Ready Solutions for AI, which utilizes a building-block approach to help better meet customers’ AI needs as they evolve. Ready Solutions for AI includes machine and deep learning, servers, software, storage, networking and services optimized for AI workloads.

Data Management (servers, storage, analytics and cloud-based workloads)

The term data management can encompass, well, pretty much everything that’s IT-related. In Dell EMC’s case, the data management focus comprises servers optimized for AI workloads (specifically the PowerEdge C-Series) and network-attached storage platforms for backup and archiving, provided through Dell EMC Isilon and Elastic Cloud Storage.

Boomi, a company Dell purchased eight years ago, specializes in cloud-based integration, API management and master data management. They’re the analytics guys.

For cloud-based workloads, Dell EMC’s Pivotal Cloud Foundry and Virtustream Enterprise Cloud fit the bill. Pivotal Cloud Foundry, originally developed by VMware, is an open-source, multi-cloud application Platform-as-a-Service (PaaS); Cloud Foundry was transferred to Pivotal Software, a joint venture between EMC, VMware and General Electric.

There will be a time, probably in the not-too-distant future, when hearing Dell without EMC will be like Exxon without Mobil. And, actually, that might be a great way to judge the success of high tech’s largest acquisition in history.

Have questions? These experts have the answers

GDT’s tenured, talented solutions architects, engineers and security analysts understand how to incorporate change by designing and deploying innovative solutions that help customers realize greater productivity, enhanced operations and more revenue. GDT leverages key partnerships with best-of-breed technology companies, like Dell EMC, to help organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

 

Want more info about IoT, AI, data management and all they encompass? Check these out:

Unwrapping DevOps

Autonomics and Automation–is there a difference?

Answer: you get a solution better than your current one

A-M-D-I-L-L: Unscrambled, these letters represent some of the hottest topics in the IT Industry

A Robust Solution for the entry-level storage customer

Don’t put off ’til tomorrow what you MUST do today

Want to read about a cool, real-world Blockchain application?

When being disruptive is a good thing

Rx for IT departments—a security check-up

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

The FTC’s findings—and what they’re doing about it—regarding small businesses and cybersecurity

By Richard Arneson

By now, most of us are aware that cyber threats don’t discriminate. Any business, regardless of size, industry or location, is a potential target. The media, of course, focuses on breaches that affect huge, public-facing, high-profile corporations whose names are recognizable (Target, Uber, LinkedIn, JP Morgan Chase and Home Depot, to name only a few). And that media focus can make small, even mid-sized, business owners feel somewhat hidden and tucked away, like a homeowner who’s selected a neighborhood off the beaten path. With cyber attacks, however, everybody’s on the path.

In fact, small businesses suffer more malware infections than their larger counterparts, and, according to the Ponemon Institute’s report 2017 State of Cybersecurity in Small- and Medium-Sized Businesses, that number is on the rise. At the time of its writing, small businesses had experienced a 61% rise in attacks during the prior twelve months; in 2016, the increase was 55%. While it probably gives cyberattackers too much credit to believe they single out and target small businesses (they tend to use a spray-and-pray technique), there’s no question—small businesses are getting caught in the crossfire.

The FTC is doing something about it

October was cybersecurity month, which seems a little odd. Every month should be cybersecurity month. Every day should be cybersecurity day, if that makes sense. And the FTC agrees.

Over the last twelve months, the FTC crisscrossed the country conducting interviews and discussions with small- to mid-sized business owners. Those discussions brought to light one primary theme as it relates to small businesses and cyber threats—they’re bringing a knife to a gunfight. The FTC saw the immediate need to launch a cybersecurity resource for small businesses to help ensure they’re protected, or at least heading in the right security-related direction.

The FTC teamed up with the Small Business Administration (SBA), the National Institute of Standards and Technology (NIST) and the Department of Homeland Security (DHS) to develop clear, easy-to-use resources, which include training, quizzes and videos on the following key security topics:

  • Cybersecurity Basics
  • NIST Cybersecurity Framework
  • Physical Security
  • Ransomware
  • Phishing
  • Business Email and Email Authentication
  • Tech Support Scams
  • Vendor Security
  • Cyber Insurance
  • Web Hosting
  • Remote Access

Yes, security threats abound, but they’re not just external. According to another Ponemon Institute study, over 75% of businesses remain largely unprotected from malicious insiders and employees lacking proper security education. Security is a lot to think about, but don’t wait until next October to learn how to protect your organization. Remember, every day is security month! And to get started, you can learn here how to give your business a security self-exam.

Don’t leave it up to chance

To find out more about the many threats that may soon target, or are currently targeting, your organization, contact GDT’s tenured and talented security analysts at SOC@GDT.com. From their Security- and Network Operations Centers, they manage, monitor and protect the networks of companies of all sizes, including those for some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

Read more about network security here:

Hiring A Hacker Probably Shouldn’t Be Part Of Your Business Plan

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Workshops uncover insights into the state of IT and Digital Transformation

By Richard Arneson

Let me get this out of the way first—IT transformation isn’t the same as digital transformation. The former is more a means of getting your organization to the latter (Read more about digital transformation here). OK, now let’s get down to IT transformation.

GDT premier partners Dell EMC and VMware covered the country from coast to coast conducting workshops on IT transformation. Wait, this just in—they conducted workshops across the globe! In the workshops, Dell EMC consultants worked with CIOs and their direct reports to evaluate their current IT state against where they’d like it to be, then discussed strategies and tactics to bridge that gap. Best of all, IDC analyzed the information and published the results for all to enjoy (you can read about them here). Following is a brief synopsis of the report:

CIOs’ Top Priorities

Portals. Yes, portals ranked as the highest priority of all projects, primarily because they have the most visibility and represent how companies experience the effects of automation and efforts to improve infrastructures.

Hybrid Cloud Architecture

Dell EMC discovered that over the last three years, more and more companies (from 65% to 84%) want to utilize a hybrid cloud architecture to support production apps. The key word, however, is “want”—they’re not there yet. Currently, only 10% of the CIOs interviewed were using a hybrid cloud approach.

DevOps

Sixty-seven percent of the CIOs want DevOps to be an integral part of the organization, and all of them plan to get there within eighteen months. Here’s the biggest reason—it currently takes them at least six months, on average, to deploy a new release.

Network Virtualization

While it doesn’t trump hybrid cloud or DevOps in importance, network virtualization is where the CIOs believe they have the largest infrastructure gap. On average, they would like at least 40% of their infrastructure to be virtualized within the next 12 to 18 months.

Automation

While Infrastructure as Code (IaC) and automating changes and management ranked high on the priority list (90% wanted it), only 5% claimed to be there already.

Over 85% of the CIOs interviewed want to do a better job of proactively, and promptly, addressing performance and capacity issues through automation and alerts. They would also like automated metering and an automated analytics engine that delivers metrics and trends for all IT services.

Top Challenges

Changing the current operations model is the hardest part of transforming to a new approach to IT. Add in service delivery transformation, new infrastructure deployment and management, and the restructuring the IT organization requires, and IT transformation becomes an even more daunting task than first imagined.

What are the most prevalent changes made by the top performers?

According to the CIOs interviewed by Dell EMC, the top performers—those in the top 20% of targeted goal achievement—had already achieved the following:

  • Executive-level, top-down support of a documented strategy and roadmap for IT Transformation.
  • IT resource provisioning taking no more than one week.
  • The utilization of cloud-based Platform-as-a-Service (PaaS).
  • The virtualization of almost 100% of their infrastructure.
  • Automation implemented to deliver IT services.

Call on the experts

IT Transformation, like Digital Transformation, is no mean feat. It involves organizational changes, and lots of them, all while keeping up with technical advances across a wide range of disciplines. That’s why talking to professionals who’ve helped companies automate processes to enhance operations and grow their bottom line should be a key element of your technology roadmap.

GDT’s tenured, talented solutions architects, engineers and security analysts understand how to incorporate change by designing and deploying innovative solutions that help customers realize greater productivity, enhanced operations and more revenue. GDT leverages key partnerships with best-of-breed technology companies, like Dell EMC, to help organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

Want more information about IT and Digital Transformation? Check these out:

Unwrapping DevOps

Autonomics and Automation–is there a difference?

Answer: you get a solution better than your current one

A-M-D-I-L-L: Unscrambled, these letters represent some of the hottest topics in the IT Industry

A Robust Solution for the entry-level storage customer

Don’t put off ’til tomorrow what you must do today

Want to read about a cool, real-world Blockchain application?

When being disruptive is a good thing

Election results are in, but there’s one (1) tally that remains to be counted

By Richard Arneson

The midterms are over and most of the concession speeches have been grudgingly made, which thankfully means no more yard signs, no more unwanted texts from candidates, and no longer having to hear “This message has been approved by…” at the end of competitor-bashing ads. But now it’s time to find out how the elections “really” went―as in, how well did preparations to quash election-related cyber threats work? While some might roll their eyes at the subject of election-related cyber threats, especially as they relate to the 2016 presidential election, they’re a very real issue and pose a considerable threat. Even the smallest, seemingly mildest of attacks could sway the results in a tight race. Some politicians made hacking claims even before the polls opened—on Monday, prior to the election, Brian Kemp, a Georgia Republican, was the first to make clear his belief that this election had already received its fair share of hacking.

The federal government has had two years to fortify its cybersecurity position, and on Monday night it issued a joint statement, endorsed by the DOJ, the FBI, the Department of Homeland Security and the Office of the Director of National Intelligence, in an attempt to allay voters’ fears: “Our agencies have been making preparations for nearly two years in advance of these elections and are closely engaged with officials on the ground to help them ensure the voting process is secure.” Department of Homeland Security Secretary Kirstjen Nielsen confidently stated that these elections would be “the most secure election we’ve ever had.” Hopefully, she’s correct.

When just the threat becomes the biggest threat

One of the many concerns surrounding the election had nothing to do with actual threats, but with the voting public’s fear of them. Election officials were concerned that these fears could dissuade people from heading to the polls. The federal government focused much of its efforts on helping local election officials better secure voting infrastructures, particularly against social media’s obvious effect on elections. Their tactic? Use the medium to disseminate correct and accurate information, and persuade voters to hit the polls without fearing their vote won’t count. For instance, officials addressed misinformation spread via Twitter claiming that U.S. Immigration and Customs Enforcement agents would be stationed at polling places to check voters’ immigration status. Twitter and Facebook attempted to head this off by scanning for such content, then wiping it clean. On Monday, Facebook announced that it had taken down well over a hundred accounts for this reason. In addition, they assembled a staff to work ’round the clock to address and respond to allegations of voter fraud.

Social media companies have also been scanning websites that publish polling locations to help ensure accurate information is being distributed. Bad info means disgruntled voters, which could mean heading home without casting a ballot. This issue was detected on websites in Florida, California and Georgia. In addition to its aforementioned statement, the federal government urged voters to check and verify the source of polling location information.

While several cybersecurity firms and election officials haven’t seen cyber attacks on any systems related to the actual voting process, they have reported an increase in attacks on infrastructures and websites. There have been reports that votes cast for Texas Democratic Senate candidate Beto O’Rourke were automatically switched to Republican incumbent Ted Cruz. While officials believe this is a case of voter, or user, error, they have posted signs reminding voters to carefully inspect their voting summaries before submitting them. On the off chance it wasn’t user error, officials were quick to point out that it had nothing to do with a cyber attack. This hyper-focus on a fair and impartial election has led many to argue for a paper trail, and that electronic voting should be a thing of the past. But while paper trails would certainly be a good thing, having them wouldn’t help battle cyber attackers who desperately want to influence elections.

Here’s the good news…and a little bad

The election process in the U.S. is widely dispersed—there’s no vast, single system to infect. That’s a good thing. However, this dispersion makes it harder to detect malicious activities, as officials in one area may not be as diligent as, or possess the threat-detecting skill sets of, those in others. And if a threat is discovered, word of it would have to be effectively communicated to tens of thousands of polling officials scattered throughout the United States.

The Key is Coordination

The overall success of the federal government’s fight against election-related cyber threats will depend on how well it communicates with local election officials and the social media companies. And while it has undoubtedly made headway in quashing threats, we may never know just how much this election was, and future ones will be, affected.

Questions about cyber threats and IT security?

If you have a security-related question, and who doesn’t in the IT industry, contact the security experts and analysts at GDT. They manage GDT’s 24x7x365 Security Operations Center (SOC), from which they oversee network security for many of the most noted enterprise organizations and government agencies in the world. Contact them today at soc@gdt.com. They’d love to hear from you.

 

Read more about network security here:

Hiring a hacker probably shouldn’t be part of your business plan

Phishing is up, and you should probably let your college-age kids know about it

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

Rx for IT departments—a security check-up

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Just when you thought it couldn’t get any better

By Richard Arneson

Who would have thought that by twisting fiber optics you could get speeds up to 100 times faster? Not me, but apparently engineers and researchers at Melbourne, Australia’s RMIT University thought it could happen. They wondered whether fiber, if twisted—or at least if the light within it was twisted—could gain another dimension, a third dimension, in which to carry 1’s and 0’s faster. Spoiler alert—it can.

Actually, it was U.S.-based researchers who first discovered that light could be “twisted”, but it was the ones from down under who first created a reasonably sized detector to read the transmitted information. The first detector was the size of a Mini Cooper, but the chip has been whittled down considerably—it’s now the width of a single strand of human hair.

While it’s unclear if the U.S. researchers were inspired by the same vision, those at RMIT got the idea by looking at the double helix spiral in DNA. Both sets of researchers, however, pondered the same thing—could increasing spiraling light’s momentum (it’s called orbital angular momentum) enhance transmission speeds?

The promise that twisted fiber holds is staggering. According to one (1) researcher who was intimately involved with the project, the number of turns could, theoretically, be infinite.

Boost efficiency, upgrade easily

With fiber optics, pulses of light transmit the data, but information can only be encoded in the light’s color and in whether it’s carried in a vertical or horizontal orientation. Twisting the light simply creates more travel space for the information within the fiber. This third dimension is like having a brand new HOV lane appear on the freeway overnight. And remember, “twisting the fiber” is a misnomer. It’s the light that’s twisted, which means today’s fiber optic networks could remain in place and untouched, whether buried or strung aerially. Each turn of the light produces a unique value, and the more turns, the faster the transmission speed.
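For readers who want a bit more precision, the standard optics result behind all this (a general sketch, not drawn from the RMIT paper itself) is that a beam whose phase winds ℓ times around its axis carries an orbital angular momentum of ℓℏ per photon:

```latex
E(r,\phi,z) \;\propto\; A(r,z)\, e^{i\ell\phi},
\qquad L_z = \ell\hbar \ \text{per photon},
\qquad \ell \in \mathbb{Z}
```

Because the integer ℓ is unbounded, each twist value can, in principle, serve as an independent data channel on the same strand of fiber, which is why the number of usable “turns” is theoretically unlimited.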

Oh, and in case you’re champing at the bit wondering what the schematic structure of twisted light looks like, the diagram—along with formulas like it—appears in RMIT’s research report, published in the journal Nature Communications.

If you have questions or would like more information about fiber optics or optical networking, contact GDT’s Optical Networking practice professionals at Optical@gdt.com. Composed of experienced optical engineers, solutions architects and project managers who specialize in optical networks, the GDT Optical Networking team supports some of the largest service providers and enterprises in the world. They’d love to hear from you.

For additional reading about fiber optics, check these out: 

A fiber optic first

When good fiber goes bad

Busting the myths about fiber optics

 

And read about how GDT’s Optical Networking team helped:

A global social networking company get some much-needed redundancy

In the world of networking, we all want five 9’s…here’s another way to get them

By Richard Arneson

We’ve all done it; you may do it every day. You pick up the phone to dial up service of some type, and tap into a call center. You may not even know you’ve entered that world, but you can usually tell. You’ve typed in your phone number, maybe your account and address, and a friendly voice answers you by name and asks how they can be of service. When it goes well, great. But when it doesn’t—which sadly isn’t unusual—it can ruin your day, or at least the next few hours of it, and leave you wondering what happened to customer service.

Now imagine overseeing that call center. You’ve been tasked with a vital component of your company’s success—customer satisfaction. With poor service comes rants on Yelp and social media, and with that comes dissatisfied customers who may never spend money with your company again.

Managing a call center is a big responsibility, whether it numbers ten or thousands of agents. Selecting the right call center platform provider is critically important to its success.

Over 17, more than 2,000, and in excess of 3 billion

Those numbers represent the years of cloud contact center experience, the customers worldwide and the annual customer interactions, respectively, to which Five9 can lay claim. They provide end-to-end, cloud-based call center solutions that increase agents’ productivity by utilizing omnichannel routing, analytics, WFO (workforce optimization) and AI (artificial intelligence).

A new aaS to remember…

CCaaS (Contact Center as a Service) pushes the customer experience to the Cloud by utilizing a contact center solutions provider’s software. And Five9 is pretty good at it—actually phenomenal—and here’s proof: for four (4) years running, Five9 has been named a leader in the Gartner Magic Quadrant for CCaaS. They are one (1) of three (3) companies considered by Gartner as leaders in this space—the rest of the pack is playing catch-up.

Here’s how they do it

Five9 Cloud Contact Center not only provides call centers with the software needed, but they handle the phone service, too. With it, you can track and manage all incoming and outgoing customer interactions. But those interactions aren’t merely limited to phone calls. They include emails, chats, social media, and more.

It’s one hundred percent (100%) cloud-based, so there are no upfront or ongoing investments in equipment or infrastructure. Here’s how it works for agents—connect to the Internet, log in and pop on a headset. It’s that easy. Everything that’s offered through costly, maintenance-heavy on-prem systems can be achieved with Five9. Pricing is based on usage, seats and features, and customers can utilize annual or month-to-month plans.

The days of designing, building, managing, monitoring, integrating and maintaining an on-prem call center solution can soon become little more than a “the way things used to be” memory.

Reporting

A key element of call center management relates to reporting, including the recording of agent interactions. Five9 provides real-time and historical reporting. You can monitor precisely how busy your call center is at any given time, including the length of calls. And historical reporting enables Five9 customers to easily customize reports that will address their specific needs, and they can be automatically pushed out to management at a time of their choosing.

Omnichannel

With Five9 Omnichannel, you can bridge the divide between different types of communication channels, whether voice or digital, with a unified and intuitive desktop. And with their intelligent routing feature, you’ll know that regardless of the channel utilized, interactions will be delivered to the right resource each and every time. And all reporting and recording features, including workforce management (WFM), quality monitoring and integration with your CRM, are omnichannel-ready.

Reliable, secure, compliant and scalable, all of which, when combined, create a great experience for your customers—that’s Five9.

Use your resources—call on the experts

Talking to professionals who’ve helped companies enhance their operations and grow their bottom line should be a key element of your technology road map. GDT’s tenured, talented solutions architects, engineers and security analysts understand how to positively incorporate change by designing and deploying innovative solutions that help customers realize greater productivity, enhanced operations and more revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

Want more information about Digital Transformation and today’s technology landscape? Read about them here:

What is Digital Transformation?

Unwrapping DevOps

Autonomics and Automation–is there a difference?

Answer: you get a solution better than your current one

A-M-D-I-L-L: Unscrambled, these letters represent some of the hottest topics in the IT Industry

A Robust Solution for the entry-level storage customer

Don’t put off ’til tomorrow what you must do today

Want to read about a cool, real-world Blockchain application?

When being disruptive is a good thing

Hiring a hacker probably shouldn’t be part of your business plan

By Richard Arneson

In just nine (9) short years, ridesharing company Uber has risen from a small, San Francisco-based startup to a highly disruptive, $6.5 billion juggernaut that, along with its competitor Lyft, has given over 2 million people with a car and spare time on their hands the opportunity to earn a little extra cash while shuttling riders around their fair city. But with precipitous growth often comes pain. In Uber’s case, the pain comes in the form of an FTC-mandated $148 million settlement payment resulting from the company’s 2016 decision, made under co-founder and erstwhile CEO Travis Kalanick, to cover up a security breach.

It’s unclear whether Kalanick knew about the plan ahead of time, but, regardless, Uber addressed the data breach—one that exposed the names and driver’s license numbers of over six hundred thousand (600,000) drivers and the personal information of another fifty-five million (55,000,000) riders—in an odd way: they hired a hacker.

The Breach

In 2016, attackers accessed GitHub, a site utilized by software engineers, and somehow obtained Uber’s credentials for its AWS account. Once in, the intruders secured unencrypted information about Uber’s drivers and riders, including email addresses, phone numbers and driver’s license numbers. But this wasn’t Uber’s first security breach rodeo. Two (2) years earlier, in 2014, a similar breach resulted in FTC-mandated sanctions. It’s believed that the 2014 incident is what led several at Uber to decide that handling the latest breach on their own, without public disclosure, sounded like a good plan. It wasn’t. And it’s why they had to write the $148,000,000.00 check made payable to the FTC.

The Uber Bug Bounty Program

Forty-eight (48) states have some type of legislation that requires companies to reveal to consumers that a data breach has occurred. While Uber eventually got around to telling the public, they did so after first trying to repair the damage with this, their half-baked plan—they paid a hacker $100,000 through Uber’s bug bounty program, which rewards any hacker who discovers and discloses software flaws. Oh, boy.

In this case, the hacker-for-hire’s job was to delete the affected data, sign a nondisclosure agreement to keep mum, and collect a cool hundred grand. The incident wasn’t reported until a year later by new CEO Dara Khosrowshahi, who declared the handling of the incident a failure, then fired two (2) employees who had signed off on the $100k payment.

After an investigation by state attorneys general determined that Uber had violated data breach notification laws, the FTC conducted its own investigation, which concluded in April of this year.

“After misleading consumers about its privacy and security practices, Uber compounded its misconduct,” said acting FTC chairman Maureen Ohlhausen. She announced that this new agreement with Uber is “designed to ensure that Uber does not engage in similar misconduct in the future.”

As a result of the FTC’s investigation, Uber will have to submit to regular privacy audits for the next twenty (20) years. And if they fail to notify the FTC of any security breaches in the future, or if they engage in or provide misleading information about how they monitor access to consumers’ personal information, they could face significant civil penalties, ones that’ll make $148 million look like the change you find between the sofa cushions.

Got questions? Call on the Security experts

To find out more about the many threats that may soon target, or are currently targeting, your organization, contact GDT’s tenured and talented security analysts at SOC@GDT.com. From their Security- and Network Operations Centers, they manage, monitor and protect the networks of some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

 

Read more about network security here:

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

Rx for IT departments—a security check-up

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Without Application Performance Monitoring, your IoT goals may be MIA

By Richard Arneson

Performance monitoring. It’s a pretty generic term, like Lots of moving parts, Best practices or Thinking outside the box—we could list them until Boston releases its next album (for the over-50 reader). But in the IoT world we live in, performance monitoring is a critical component of whether the technology delivers on its promises, or results in little more than a nuisance to your IT staff. And being a critical component of a technology estimated to exceed $1.3 trillion in spending within the next three (3) years makes performance monitoring pretty important.

Whether it’s a wearable, a thermostat, a doorbell or an appliance, IoT is creeping into all facets of our everyday lives. It has already had a tremendous impact on society, both here and abroad, and, while it may be hard to imagine how it will continue to affect our lives, bet on this:  it will, and for a long time.

Here’s the IoT conundrum

With the number of new devices on the market and the technology’s precipitous growth, management of IoT devices is unwieldy and requires considerable man hours.

IoT is a little like Austin, Texas. Everybody loves it and thinks it’s cool, but its popularity and perceived hipness caused it to outgrow its infrastructure. When Austin was only home to the University of Texas and state government, driving from one end to the other was a piece of cake, a 10-minute trip. Then the growth came, and came, and came, and the number of businesses and transplanted workers burst its infrastructure at the seams. Have you tried driving around Austin lately? The Let’s Keep Austin Weird slogan should be changed to Let’s keep Austin under 3 million people.

IoT is bursting at the seams. IT departments are having trouble managing it, as personnel are spending inordinate amounts of time trying to locate the sources of issues, not to mention fixing them. Exponentially more data is traversing the network, making capacity planning and management more complex and daunting. And with IoT growth comes the need for more robust hardware, complex applications and ecosystems, new protocols and security requirements, additional backup needs, and better, faster transmission technologies. IT departments, like Austin, are having trouble keeping up.

How to combat the conundrum

AppDynamics is a 10-year-old San Francisco-based application performance management (APM) and operations analytics company that was purchased by Cisco in March of 2017. Their focus? Performance and availability management across cloud environments, including the data center. Their APM product, AppDynamics APM, allows issues to be easily detected, then remediated quickly and cost-effectively. You can read about what they did for Nasdaq, the largest, by volume, U.S. stock exchange here. Think for a second what a service outage could mean to Nasdaq.

Here’s how AppDynamics APM works…

With AppDynamics’ IoT advanced performance monitoring and management tools, customers get real-time diagnostics and, more importantly, solutions to issues that pop up on the radar screen. The tools deliver exactly what’s needed in an IoT-intensive environment: smooth, simple configuration, easier deployment and provisioning, and the ongoing maintenance of all IoT-connected devices. Through predictive analytics and business intelligence, customers get alerts concerning potential problems, so they can be proactively remediated before the potential problem becomes a real one, such as a network outage.

According to the MIT Technology Review, far and away the top two (2) benefits of application performance monitoring for IoT are improved performance and security. With AppDynamics APM, those benefits—and much more—are exactly what customers get.

IoT and APM? Think GDT

While it’s an exciting technology, IoT only becomes beneficial to organizations if its deployment results in economies of scale, cost efficiencies, optimization of resources, better customer service and satisfaction, and, ultimately, higher revenue. That’s why consulting with IoT professionals like those at GDT is not only critically important now, but will become more so day by day. GDT’s tenured, talented solutions architects, engineers and security analysts understand how to incorporate IoT and APM tools to help customers realize more productivity, enhanced operations and greater revenue. GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

You can read more about IoT below:

How Does IoT fit in with SD-WAN?

Five (5) things to consider prior to your IoT journey

Can’t wait for 5G? The FCC has done something to speed up your wait time

Sexy, yes, but potentially dangerous

What is digital transformation?

Why companies are turning to Mobility Managed Solutions (MMS)?

Phishing is up, and you should probably let your college-aged kids know about it

By Richard Arneson

We’re entering the holiday season, which used to mean trips to the mall, circling the parking lot for a spot within a hundred yards of the door, and trying to get the clerk to accept a coupon that expired a year ago. But that’s all changed. Now it’s about trying to remember passwords on your computer, hoping your Internet provider’s network will hold up during a storm, and trying to figure out where and how to enter a coupon on Amazon’s website. But, best of all, your holiday shopping can all be done while planted in front of the TV and watching football.

But with the ease of the shopping experience comes the art and evil of phishing. And in the last twelve (12) months, phishing has tripled. Because retailers are especially vulnerable, you can count on those numbers rising in the coming months.

Think before you click

The primary reason retailers are so susceptible is that their customers (yes, all of us) are ripe for the picking, at least those who don’t carefully inspect the origin of emails or the URLs they visit after clicking on any embedded links. For instance, you might think you’ve gone to Walmart, but upon careful inspection the URL may be Walmart.us.com, even though the copy looks like the real thing. All the scammers want is for you to make that one (1) simple purchase and enter your credit card info. Once they have that, it’s holiday time for them. They’ll enter the real site, order goods with your credit card and, after you’ve disputed the purchase, the retailer credits your account and the bad guys get the merchandise.

The newest targets

One of the many reasons phishing is up is its newest target demographic—younger consumers whose first credit card may be burning a hole in their wallet. And where do you go to phish for new victims? Yep, the sites they use on a regular basis. Many retailers now sell their products through alternate channels, like Snapchat and Facebook. In fact, Instagram has become the phishers’ favorite new vehicle of choice due to retailers’ relative infancy in that marketplace.

A recent study on fraudulent retail websites found that there are three times (3X) more of them than there were a year ago. Why? Because phishing is working especially well on the aforementioned sites. When scammers send out millions of emails, the odds are pretty good that at least a few recipients will fall for them. And that’s all they need.

Quick Tips for protecting yourself against Phishing

  • Utilize an anti-virus product that is capable of detecting fraudulent and malicious websites, or what vendors may refer to as anti-phishing technology in their marketing materials.
  • Type in the URL of the retailer’s website. This will ensure you’re heading to the right place. I know, it’s easier to click on the link, but typing it in will only cost you a few additional seconds.
  • If you’re ever questioning a site’s authenticity, type in a fake password. If it’s accepted, trouble’s lurking—they’ll accept anything for the password. Close it out and delete your browsing history.
  • Also, regularly inspect your credit card and bank statements. It’s not fun reading or an activity you’ll look forward to, but careful inspection is one (1) of the best medicines.
  • When you see all CAPS in the subject line, you’ve probably received a phishing email. Why scammers like ALL CAPS is unclear, but it’s a common practice.
  • Check that the e-commerce site you’re visiting begins with https://, not http://. The S is for Secure, meaning all communications between you and the website are encrypted.
  • Look for misspelled words or really, really poor grammar. You won’t need an English degree to spot it—it’ll dramatically stand out.
  • If you’ve entered a site and the images are of poor quality or low resolution, you’re probably on a fraudulent site. You won’t see butchered images on the websites of reputable retailers.
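Two of the tips above (insist on https://, and make sure the host really belongs to the retailer you think it does) can even be automated. Here’s a minimal sketch in Python; the `looks_risky` helper and the `trusted_domains` allow-list are hypothetical names for illustration, not part of any real product:

```python
from urllib.parse import urlparse

def looks_risky(url, trusted_domains):
    """Flag a URL that fails two basic phishing checks:
    it must use https://, and its host must be a trusted
    domain (or a direct subdomain of one)."""
    parts = urlparse(url)
    # The site should use https://, not http://
    if parts.scheme != "https":
        return True
    host = (parts.hostname or "").lower()
    # "walmart.us.com" is NOT "walmart.com": accept only an
    # exact match or a subdomain like "www.walmart.com".
    return not any(
        host == domain or host.endswith("." + domain)
        for domain in trusted_domains
    )

# A lookalike domain and a plain-http link both get flagged:
print(looks_risky("https://walmart.us.com/deals", {"walmart.com"}))   # True
print(looks_risky("http://walmart.com/deals", {"walmart.com"}))       # True
print(looks_risky("https://www.walmart.com/deals", {"walmart.com"}))  # False
```

The subdomain check matters because scammers rely on hosts that merely contain a trusted brand name; requiring an exact match or a true subdomain defeats that trick.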

Most of us want to get our holiday shopping done as quickly as possible, especially at a time when football is heading into the postseason. But taking a little extra time and care prior to opening an email or navigating a website will help make the holiday season a more enjoyable and less stressful affair.

Got questions? Call on the Security experts

To find out more about phishing, cybersecurity and the many threats that may soon target, or are currently targeting, your organization, contact GDT’s tenured and talented security analysts at SOC@GDT.com. From their Security- and Network Operations Centers, they manage, monitor and protect the networks of some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

 

Read more about network security here:

Gen V

Sexy, yes, but potentially dangerous

Tetration—you should know its meaning

It’s in their DNA

Rx for IT departments—a security check-up

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

When implementing a new technology, don’t forget this word

By Richard Arneson

“Change is the only constant.” — Heraclitus, Greek philosopher

 

Change—it’s a word that’s dreaded by many, feared by most and embraced by few. But even if you fall into the latter camp, you can’t deny there’s at least some level of trepidation prior to entering the world of the unknown. Change can be so daunting it has become its own cottage industry—there are over 70,000 book titles on Amazon.com that address the subject, roughly the same number as for weight loss.

But using the word “change” to describe your company’s use of a new technology to help support its digital transformation goals seems inadequate. Sure, it can mean new processes to learn, different reports to analyze, loads of new data from which to glean information, and unfamiliar dashboards to navigate, but it’s far more than change. Remember, we’re talking transformation, as in jumping into the digital evolution stream feet first and letting it carry you downriver to higher profits. In a recent Gartner study, well over fifty percent (50%) of CEOs cited improved profitability as their ultimate goal when implementing new technologies and achieving digital transformation.

When implementing a new technology—or multiple ones—consider the following to help allay the change-based fears that will certainly exist within your organization.

Transparency is the best policy

Share with employees why these new technologies are being implemented and how, even though change isn’t easy, they will ultimately result in an enhanced working environment and a more profitable company. But don’t just tell them, show them. Provide concrete examples of organizations that have enjoyed the success you’ll soon achieve. If you’re not transparent with your organization, the imminent changes won’t be met with very open arms. A lack of transparency will breed negativity that spreads throughout your organization at breakneck speed. You’re trying to motivate employees to want, not dread, the coming changes.

Openly share your digital transformation roadmap

Once employees know what they’ll soon enjoy from the introduction of any new technology, share with them exactly how you’ll get there and the important role each employee will play in the transformation. Leave nothing to interpretation, but ensure the roadmap is realistic and attainable. Springing on them your goal to go from 0 to 60 in less than a couple of seconds will quickly steer your organization down the road to anxiety. And remember, this is a very good job market. If your employees are overwhelmed, calls from recruiters will be better received and possibly given greater consideration. And once the roadmap is shared, solicit feedback. Make employees feel like their concerns aren’t falling on deaf ears.

Set up a system of support

Introduce detailed training plans for each department and customize them accordingly. A training or video series for the marketing department shouldn’t look the same as ones for IT—let’s face it, they speak different languages. If employees feel that their time is being wasted by sitting through training that may provide little or no help as it relates to their position, cynicism will begin to seep in and the negativity will soon flow throughout their departments.

Have champions from each department carry the banner for a new technology, and enlist them to help customize training for their respective department. Not customizing it may speak volumes, whether true or not, about how much care and consideration a new technology or the entire digital transformation journey has been given.

Anticipate the problems…sorry, but there’ll be some

As part of your roadmap, include how, when snags or issues surface, you’re prepared to address them promptly and comprehensively. Let employees know that the journey will be supported by procedures, practices and personnel that will remediate issues as they surface.

And problems aren’t reserved solely for breakdowns or overlooked issues. Plan, too, for how changes to the roadmap itself will be addressed. First drafts are never perfect out of the chute. There are issues you’ll face, even if it’s impossible to see them ahead of time. Get ready…there will be a need at some point for the roadmap to be adjusted or altered.

Ease into it

First-time skiers don’t start on the black slopes. The same goes for the adoption of new technologies. Start on the green slopes of the roll-out phases. Learn from those, let employees see how it’s done and that it can be done. Show them that the roadmap is leading your organization in the right direction. Then move to the blue slopes, and on to the black. And once a phase is successfully implemented, don’t haphazardly move on to the next. If it’s possible, budget time to let employees learn the new system, the tools, reporting, et al. Let them get used to it, get their hands a little dirty, and take a few breaths prior to moving to the next phase.

Call on the experts

Change is never easy, but leaving it up to chance can devastate morale, create unnecessary personnel attrition and adversely affect your bottom line. That’s why working with a company that’s helped many organizations of all sizes and from a variety of industries achieve their digital transformation goals should be a key element of your technology roadmap. GDT’s tenured, talented solutions architects, engineers and security analysts understand how to positively incorporate change by designing and deploying innovative solutions that help customers realize positive business outcomes. By being customer- and outcome-focused, GDT helps organizations transform their legacy environments into highly productive digital infrastructures and architectures. You can reach them at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.


Want more information about new technologies and digital transformation?

What is Digital Transformation?

The 6 Rs of a Cloud Migration

Calculating the costs–soft and hard–of a cloud migration

Security Concerns about SD-WAN?

A Robust Solution for the entry-level storage customer

Don’t put off ’til tomorrow what you must do today

Want to read about a cool, real-world Blockchain application?

When being disruptive is a good thing

Enough with the Aggie jokes—Texas A&M’s new initiative to combat cyber threats is nothing to laugh about

By Richard Arneson

Some things just don’t make sense, like why a baseball that hits the foul pole is a fair ball. Shouldn’t it be called the fair pole? Or why hot dogs come in packs of ten (10) but the buns in quantities of eight (8). Oh, and how about this one—it’s estimated that within the next three (3) years almost 4 million (4,000,000) cybersecurity jobs will go unfilled due to both a lack of interest and a lack of adequate training. It doesn’t seem possible given the number of cybersecurity events we hear about every week, what with the ransomware, the Trojans, the viruses, the malware, etc. You’d think cybersecurity would be attracting professionals in droves, but it isn’t. Texas A&M University is doing something about it, though.

While many larger corporations have enacted specialized apprenticeship programs in cybersecurity, including mobile training trucks for personnel, the Fightin’ Texas Aggies have taken a far more proactive approach to the issue, and it’s one from which they’re immediately benefiting. To address their cybersecurity labor shortage, they’re pairing students with AI software to protect the school’s systems from cyber attacks. In turn, the students get security training and a great, hands-on addition to the resume.

Each month, the Texas A&M University System, which includes eleven (11) universities and seven (7) state agencies, estimates that there are approximately a million attempts to hack into their systems. Prior to implementing this program, IT security was handled by a lean staff that included few full-time employees. Now ten (10) students comprise the majority of their IT security team, and they’re utilizing AI software to detect, monitor and remedy threats. And they’re having no trouble filling these positions. Word has spread throughout campus that this high-visibility program provides insightful skill sets and extremely marketable training.

Nothing beats on-the-job experience

The students’ first order of business each day is to study a whiteboard that outlines areas within the university system that have faced, or are currently facing, a threat. The threats are compiled through AI, which also prioritizes each one. Then it’s up to the students to analyze any abnormalities and determine whether they appear suspicious by comparing them to prior attacks.

AI software is key to this initiative, serving as a great springboard for inexperienced cybersecurity students by allowing them to evaluate threats immediately. But the AI doesn’t act on the threats—which some consider a risky proposition in the first place—it’s left up to the students to remediate the issues.

So why the lack of professionals in cybersecurity?

Almost fifty percent (50%) of security executives recently surveyed by ISSA (Information Systems Security Association) said that this glaring lack of security professionals is due to two (2) things—high turnover and a high rate of job burnout. And while Texas A&M’s SOC (Security Operations Center) isn’t immune to either, it’s attempting to address both by throwing numbers at the problem, in the form of the many students looking for an opportunity to work there. And thanks to those numbers, students are able to spend time training or working on side projects that make great additions to their resumes. Gig ’em.

Got questions? Call on the Security experts

To find out more about cybersecurity and the threats that may soon target your organization, contact GDT’s tenured and talented security analysts at SOC@GDT.com. From their Security- and Network Operations Centers, they manage, monitor and protect the networks of some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.


Read more about network security here:

Gen V

Sexy, yes, but potentially dangerous

Tetration—do you know its meaning?

It’s in their DNA

Rx for IT departments—a security check-up

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware

Can’t wait for 5G? The FCC has done something to speed up your wait time

By Richard Arneson

Whether you’re a dyed-in-the-wool technophile or just one of those people who has to be the first to have the latest gizmo or gadget, you’re probably eagerly anticipating 5G, which will provide consumers a host of benefits, including faster speeds, lower latency and a more IoT-friendly wireless infrastructure. But when you hear that 5G won’t be fully deployed for another four (4) years, it kinda ruins the mood. Unfortunately, service providers can’t roll out 5G—or any G, for that matter—all at once. Think of the cell towers that need to be upgraded from coast to coast—it’d take almost half a million technicians working simultaneously to accomplish this feat in one fell swoop. Yes, the rollout will begin within the next couple of months, but if you’re not in one (1) of the lucky roll-out areas, you’ll have to wait…and wait…and potentially wait another four (4) years.

…to the rescue

The Federal Communications Commission (FCC) wants to do something about that waiting. And it has. On August 2nd, the commission voted on rules to speed up rollouts of not just 5G, but new networks as well. These rules, known as One Touch Make Ready (OTMR), address the strict, cumbersome laws that specify the required distance that must separate network elements attached to a pole—usually a telephone pole.

When a new service provider enters a market, or an existing one (1) wants to address poor connectivity in an area by adding a site, any equipment or wires already attached to the pole must be reconfigured to ensure the required distance is maintained. The process is so painful that many speculate it’s the very reason Google Fiber had to greatly throttle back its once-aggressive deployment schedule.

Currently, laws related to cell towers are handled through the jurisdiction in which they reside. Resultant installations are a headache at best, a nightmare at worst, and pole access to new competitors is delegated to “least important” status. Because accommodating new competitors is reliant on the reconfiguration of equipment and wiring by incumbent carriers, the process is, as you probably imagined, not one of their higher priorities.

According to FCC Chairman Ajit Pai: “For a competitive entrant, especially a small company, breaking into the market can be hard, if not impossible, if your business plan relies on other entities to make room for you on those poles. Today, a broadband provider that wants to attach fiber or other equipment to a pole first must wait for, and pay for, each existing attacher [installer] to sequentially move existing equipment and wires. This can take months. And the bill for multiple truck rolls adds up. For companies of any size, pole attachment problems represent one of the biggest barriers to broadband deployment.”

Beyond 5G, the FCC believes the new rule will mean 8.3 million additional premises passed with fiber, with in excess of $12.6 billion spent on those projects. And in addition to faster installations of cell sites, the new rules will greatly enhance the fiber density available for wireless backhaul.

Mobility Experts with answers

If you have questions about your organization’s current mobility strategy (or the one you’d like to implement) and how 5G will affect it, contact GDT’s Mobility Solutions experts at Mobility_Team@gdt.com. They’re comprised of experienced solutions architects and engineers who have implemented mobility solutions for some of the largest organizations in the world. They’d love to hear from you.

Usually just a minor annoyance, the Flash Player update can now result in a major ordeal

By Richard Arneson

It’s one (1) of the most common speed bumps on the Internet highway—the Adobe Flash Player update message. It’s unexpected and never welcome—a little like a tornado, but not quite that bad. It may not trump some of the other digital speed bumps, like the Windows update you have to sit through after you’ve hit “Shut Down” on your computer (you know, the one that usually occurs at 5:30 on Friday afternoon), but it still serves as one (1) of computing’s many figurative mosquitoes. But while the Flash update has only proven to be a minor annoyance, you can now place it in another category―crippling.

Palo Alto Networks, the Santa Clara, CA-based cybersecurity firm, discovered earlier this month that a fake Flash updater has been loading malware on networks since early August. Here’s the interesting part—it actually installs a legitimate Flash update. But before you think cyber attackers have gone soft, they’re downloading Flash for distraction purposes only. And while the update is taking place, another installation is occurring—a bot named XMRig, which mines a cryptocurrency named Monero. Once the install(s) are complete, the user, unbeknownst to them, begins mining Monero. And there you have it—cryptojacking.

Cryptojacking with XMRig

Once the phony Flash update is launched, the user is directed to a fake URL that, of course, isn’t connected to an Adobe server. After the Flash update is installed, XMRig accesses a Monero mining pool—and the fun begins. XMRig begins mining Monero from infected, networked computers as unknowing users merrily work along, completing their day-to-day tasks. Keep in mind that Monero is a legitimate form of cryptocurrency. Like Bitcoin for ransomware, Monero is the cryptocurrency of choice for cryptojacking. Monero’s website claims it is “the leading cryptocurrency with a focus on private and censorship-resistant transactions.” (Unlike Bitcoin, Monero doesn’t require the recipient to disclose their wallet address to receive payment(s)).

Let’s back up a bit—here’s how crypto mining works

It can be argued that cryptojacking has replaced ransomware as cyberattackers’ malevolent deed of choice. It’s important to remember, though, that cryptocurrency mining is legal—it’s how cryptocurrency works. Mining is the process of finding transactions, then adding them to a currency’s public ledger. Transactions are grouped into blocks, and those blocks are chained together—hence the name blockchain.

A blockchain’s ledger isn’t housed in one (1) centralized location. Instead, it is simultaneously maintained in duplicate across a network of computers—millions of them. Encryption controls and protects the creation of new coins and the transfer of funds, without disclosing ownership. Transactions enter circulation through mining, which basically turns computing resources into coins. Anybody can mine cryptocurrency by downloading open-source mining software, which allows their computer to mine, or account for, the currency. Mining solves a mathematical problem associated with each transaction, which verifies that the sender’s account can cover the payment, determines to which wallet the payment should be made, and updates the all-important ledger. The first miner to solve the problem gets paid a commission in the particular currency being mined.
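To make that concrete, the “mathematical problem” in most proof-of-work currencies is a hash puzzle: find a nonce that gives the block’s hash a required number of leading zeros. A toy sketch in Python (the transaction data and difficulty are invented for illustration; real miners like XMRig work at vastly higher difficulty):

```python
import hashlib

def mine(block_data, difficulty=4):
    """Search for a nonce whose SHA-256 digest of block_data + nonce
    starts with `difficulty` zero hex digits, a toy stand-in for the
    puzzle real miners race to solve."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest  # the first solver "wins" the commission
        nonce += 1

# A made-up transaction; signature checks and ledger updates are omitted.
nonce, digest = mine("alice->bob:5")
print(nonce, digest)
```

Each extra zero of difficulty multiplies the expected work by sixteen, which is why profitable mining now demands the purpose-built hardware described below.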

In cryptocurrency’s nascency, the computing power needed was minimal. Basically, anybody could do it. Now the computing power needed to mine cryptocurrency is considerable, with miners requiring expensive, purpose-built, super-powerful computers. Without them, they can forget about making decent mining money. But building the computing resources needed to profitably mine cryptocurrency today is expensive, often cost prohibitive. In cryptojacking, however, the cyber attackers network together infected computers and utilize their computing power without spending a dime. In turn, each victim’s infected computer is busy surreptitiously mining cryptocurrency and slowing to a crawl. The bad guys enjoy pure net revenue.

Got questions? Call on the Security experts

To find out more about cryptojacking, ransomware, malware, Trojans, and the host of security-related issues your organization needs to consider and fend off, contact GDT’s tenured and talented security analysts at SOC@GDT.com. From their Security- and Network Operations Centers, they manage, monitor and protect the networks of some of the most notable enterprises, service providers, healthcare organizations and government agencies in the world. They’d love to hear from you.

Get more information about network security here:

Gen V

Sexy, yes, but potentially dangerous

Tetration—do you know its meaning?

It’s in their DNA

Rx for IT departments—a security check-up

When SOC plays second fiddle to NOC, you could be in for an expensive tune

How to protect against Ransomware


Hybrid Cloud Conundrums? Consider HPE GreenLake Flex Cap

By Richard Arneson

If you need to purchase a container to hold what you’re estimating is between 48 and 60 ounces of liquid, are you going to buy the 50- or the 70-ounce container? Yes, you’ll play it safe and get the bigger one, but you’ll spend more money and it will take up more space on the shelf. And it won’t be very satisfying, especially if you miscalculated and only had thirty-six (36) ounces to begin with. In short, you didn’t do a very good job of right-sizing your container solution. And that’s exactly what IT administrators have struggled with for years, whether it’s bandwidth, equipment or any other type of technology solution. Unfortunately, right-sizing an IT recipe usually requires a dash of hope.

Pay-as-you-go trumps the guesswork of right-sizing

HPE GreenLake Flex Capacity is a hybrid cloud solution that gives customers a public cloud experience, but with the peace of mind that often comes with on-premises deployments. It’s a pay-as-you-go solution, so right-sizing can become the dinosaur of the high-tech industry. HPE GreenLake Flex Cap provides capacity on demand and scales quickly to meet growth needs, without the often-long wait times that come with circuit provisioning.

And it gets better―management is greatly simplified; customers can manage all their cloud resources, and in the environment of their choosing. HPE GreenLake customers enjoy:

  • Limited risk by maintaining certain workloads on-prem
  • Better and more accurate alignment of cash flows, no upfront costs and a pay-as-you-go model
  • Savings by no longer wasting dollars on circuit overprovisioning
  • Immediate scalability to address the needs of your network
  • Real-time failure alerts with remediation recommendations
  • The ability to perfectly size capacity
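The savings in the bullets above come straight from the container math in the opening analogy. A back-of-the-envelope sketch (all capacities and prices are hypothetical, purely for illustration):

```python
def overprovisioned_cost(capacity_tb, price_per_tb):
    """Fixed model: pay for the whole container, used or not."""
    return capacity_tb * price_per_tb

def pay_as_you_go_cost(used_tb, price_per_tb):
    """Consumption model: pay only for what's actually used."""
    return used_tb * price_per_tb

# Hypothetical: bought 70 TB "to be safe," actually used 36 TB.
fixed = overprovisioned_cost(70, price_per_tb=100)
flex = pay_as_you_go_cost(36, price_per_tb=100)
print(f"Overprovisioned: ${fixed:,.0f}, pay-as-you-go: ${flex:,.0f}, "
      f"wasted: ${fixed - flex:,.0f}")
```

Real pricing models are more nuanced, but the gap between purchased capacity and consumed capacity is exactly the waste a pay-as-you-go model eliminates.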

And with these integrated, turnkey packages, your organization can enjoy HPE GreenLake Flex Cap even faster:


GreenLake for Microsoft Azure or Amazon Web Services (AWS)

Whether you’re utilizing Microsoft Azure or Amazon Web Services (AWS) for your cloud environment, GreenLake Flex Cap can provide turnkey controls for performance, compliance and costs.

GreenLake for SAP HANA

SAP HANA customers can enjoy a fully managed, on-prem appliance with right-sized SAP®-certified hardware and services to satisfy workload performance and availability requirements. HPE is the leading supplier of SAP infrastructure, and HPE GreenLake for SAP HANA delivers the performance, control and security needed for the most demanding of mission-critical applications.

GreenLake for Big Data

GreenLake for Big Data accelerates time-to-value with asymmetric or symmetric configurations, and because datasets never have to be shipped to third-party data centers, there are none of the security issues or repatriation risks that come with doing so.

GreenLake for EDB Postgres

Reduce TCO and simplify operations with this Oracle-compatible, open-source database platform. Your teams will be able to better focus on applications and insights that will drive business outcomes.

GreenLake for Backup

Pay for exactly what you back up. Yes, it’s that simple. GreenLake for Backup includes Commvault software that’s pre-integrated on your choice of HPE StoreOnce or HPE 3PAR Storage.

Now combine GreenLake with HPE Pointnext

HPE Pointnext not only monitors and manages the entire solution, but also provides customers with a portal that delivers key analytics and detailed consumption metrics.

Questions? Call on the experts

If you have additional questions or need more information about HPE GreenLake Flex Capacity and the many benefits it can provide your IT organization, contact one of the talented and tenured solutions architects or engineers at GDT. They can be reached at SolutionsArchitects@GDT.com or at Engineering@GDT.com. They’d love to hear from you.

Answer: You get a solution better than your current one

By Richard Arneson

Question: What happens when you combine AI (artificial intelligence) and Wi-Fi? Apologies to Alex Trebek and Jeopardy, but this particular solution is so cool, exciting and effective that I couldn’t bury the lead and had to skip straight to the answer.

Wi-Fi has been part of our lexicon and lifestyle since 2003 and, no question, it was revolutionary. Connecting your computer to the network without wires…could it get any better than that? The technology remained fairly stagnant for several years, however. Any claim that Wi-Fi was stuck in the Dark Ages would have been a gross exaggeration, but it was beginning to feel a bit stale. And with that came dissatisfaction, user-(un)friendly experiences and, ultimately, the worst adjective consumers can attach to a technology–frustrating.

It all changed in 2007, though. The launch of the iPhone, including its phenomenally successful marketing campaign, resulted in consumers snapping them up like snow cones on a hot summer day. Hello, smart device. Then came other smart devices—tablets, watches, doorbells, thermostats, et al.—which generate thirteen times (13x) more traffic than non-smart ones. And then came Mist.

Mist Systems

Based in Cupertino, CA, four-year-old Mist Systems was funded by several top investors, most notably Cisco Investments. The folks at Mist wondered why 12.6 billion smart devices worldwide were relying on a technology that wasn’t terribly, well, smart. They set out to develop a learning wireless LAN solution that would, among other features, replace time-consuming, often frustrating manual tasks with proactive automation.

Mist began with three (3) end goals in mind: improve network reliability, transform IT services and enhance the user experience.

Mist set out to fix the ills of Radio Resource Management (RRM), which manages several characteristics inherent in wireless communications, such as co-channel interference and signal issues. The problem with RRM is that it has always been hamstrung by a lack of user insight due to poor data collection. Not so with Mist, which utilizes AI to create a Wi-Fi solution that heals itself.

Mist constantly collects per-user RF (radio frequency) information regarding coverage, throughput, capacity and performance. The collected data is analyzed through AI to proactively make changes that enhance the user experience.

Service Level Expectations (SLEs)

Mist offers the only Wi-Fi solution on the market that allows for SLEs that clients can customize based on their needs. In addition to traditional metrics, such as coverage, throughput, uptime and latency, Mist customers can set, monitor and enforce their defined SLEs, which allows them to better understand just how issues such as jitter, packet loss and latency are adversely affecting end users.
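Conceptually, a customer-defined SLE boils down to per-metric thresholds that incoming measurements are continuously checked against. A simplified sketch (the metric names, limits and sample values are invented for illustration; this is not Mist’s actual API):

```python
# Customer-defined SLE thresholds (hypothetical values and metric names).
sle = {"latency_ms": 50, "jitter_ms": 10, "packet_loss_pct": 1.0}

def sle_violations(sample, thresholds):
    """Return the metrics in `sample` that exceed their SLE threshold."""
    return [m for m, limit in thresholds.items() if sample.get(m, 0) > limit]

# One per-user measurement, as a self-monitoring WLAN might collect it.
sample = {"latency_ms": 72, "jitter_ms": 4, "packet_loss_pct": 2.5}
print(sle_violations(sample, sle))  # latency and packet loss breach the SLE
```

The value of a platform like Mist’s is that checks like this run automatically, per user and in real time, and feed remediation rather than a report.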

Here’s why Mist is truly refreshing

Mist offers the only enterprise-class wireless solution that is powered by a microservices cloud architecture and doesn’t require a WLAN Controller. As a result, customers enjoy enhanced agility and scalability from an AI engine that gathers data and insight, and utilizes automation to deliver a self-healing Wi-Fi solution.

Mist introduces customers to Marvis, their virtual network assistant built on AI, deep learning and machine learning. By using Natural Language Processing (NLP), Marvis provides IT administrators with immediate answers, so time once wasted digging for them in Command Line Interfaces (CLIs) or dashboards can be better spent on other tasks or projects.

Mist can lay claim to another first―they offer the only Enterprise Bluetooth Low Energy (BLE) solution that doesn’t require manual calibration. And additional beacons aren’t required; Mist developed proprietary virtual BLEs, which can be moved around as needed with a simple mouse click or API call.

Mist’s solution provides what Wi-Fi has always aspired to be, and then some―a predictable, reliable and self-healing Wi-Fi solution based on extensive data collection, AI and machine learning.

There are no dumb smart questions

If you have questions about smart devices, IoT, or Wi-Fi solutions―including Mist Systems’―contact the talented, tenured solutions architects and engineers at GDT’s IoT and Mobility Solutions practice. They can be reached at Mobility_Team@gdt.com. They’d love to hear from you.

For more about Mobility Solutions and IoT…

Click here to get more information about mobility solutions, and here to watch a video about how GDT delivered a secure mobility solution to a large retailer.

The 6 (correctly spelled) R’s of a Cloud Migration

By Richard Arneson

It’s always confounded me that two (2) of the three (3) R’s of education―reading, writing and arithmetic—were spelled wrong. Whoever coined the phrase was obviously trying to set students up to fail at spelling. Thankfully, we work in an industry that understands the proper spelling of R words; in this case, I’m referring to the six (6) R’s of a cloud migration. That’s not to say you have to pick just one (1), though. It’s not an either/or scenario. Your organization might require, if you want to fully enjoy the cloud and all it has to offer, several of the following types of cloud migrations. That’s where experience and expertise come in.

Re-host (aka Lift and Shift)

Re-hosting applications to the cloud is common, especially if a company wants to ramp up its cloud migration as quickly as possible. For instance, there might be a certain business case that demands a fast deployment. In re-hosting, applications are moved to the cloud as-is, even if cloud optimizations haven’t taken place. As a result, companies can enjoy quick savings, but not everything they might want, due to the abbreviated time line.

Once workloads and applications have been re-hosted, it becomes easier to optimize and re-architect them in the future. Amazon Web Services (AWS) has a solution for this called Snowball, which securely transfers data at petabyte scale into and out of its cloud. Also, the VM Import/Export automated transfer tool allows you to utilize existing VM purchases by easily importing them into the AWS Cloud.

Re-platform (aka Lift, Shift and Tweak)

Re-platforming takes the re-hosting approach, but also addresses a common issue―not all applications can be migrated to the cloud. While an application may not be able to run on an IaaS platform, it may be able to run on IaaS servers. In this case, an emulator can be used, which runs in the cloud of the provider you choose (AWS, Microsoft Azure, Google Cloud). The applications will appear no different to end users―same front end, interfaces, look and feel. If rebuilding a current system is cost prohibitive, you can still enjoy cloud technologies on a legacy infrastructure through re-platforming.

Re-architect (aka Re-write)

Re-architecting is like purchasing a Mercedes with all the options and features attached. Yes, it’ll cost you, but if you’re looking for a superior level of performance, business continuity, flexibility and scalability, this will be your best option. It’s a good bet that companies touting and enjoying tremendous cloud benefits have utilized this migration strategy.

And if you initially choose to re-host an application, that doesn’t mean you can’t re-architect it in the future. If you’d like, re-host now, re-architect later. Doing so can reduce the project’s complexity by separating application re-design from the cloud migration.

Re-purchase (aka Drop and Shop)

Think Salesforce. Think SaaS. Re-purchasing is simply a matter of changing the licensing. In the case of Salesforce, you’re going from a legacy CRM to a cloud option. You’ll save both hard and soft costs, such as the time it takes an IT staffer to manage, maintain and monitor the application.

Retire (aka Curbside pickup)

One of the key elements of creating a cloud migration strategy is to first conduct a thorough assessment of your existing environment, applications, workloads, etc. If done properly and comprehensively, the assessment will be able to determine which IT elements can be hauled out to the trash. And with retirement comes cost savings.

Retain (aka You can stay…for a while)

If you’re not ready to move a particular application to the cloud for whatever reason (depreciation, performance concerns, gut feeling…), you may want to keep the status quo for a while. That’s not to say you’ll want to retain it forever. The more comfortable you become with the cloud and a migration, the sooner you’ll probably begin to move applications off the Retain list.
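As a rough illustration of how an assessment might triage applications across the six R’s, here’s a hypothetical first-pass function; the attributes and rules are invented, and real migration decisions weigh far more factors:

```python
def migration_strategy(app):
    """Map a few coarse application attributes to one of the six R's."""
    if app.get("end_of_life"):
        return "Retire"
    if app.get("saas_equivalent"):
        return "Re-purchase"
    if app.get("still_depreciating") or app.get("compliance_hold"):
        return "Retain"
    if app.get("needs_cloud_native_redesign"):
        return "Re-architect"
    if app.get("runs_unchanged_on_cloud"):
        return "Re-host"
    return "Re-platform"  # default: lift, shift and tweak

print(migration_strategy({"saas_equivalent": True}))          # Re-purchase
print(migration_strategy({"runs_unchanged_on_cloud": True}))  # Re-host
```

The point isn’t the rules themselves but the shape of the exercise: every application in the inventory gets an explicit, revisitable strategy instead of a guess.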

It all starts with Expertise―then an Assessment

Moving to the cloud is a big move; it might be the biggest move of your IT career. If you don’t have the right cloud skill sets, expertise and experience on staff, you may soon be wondering if the cloud is all it’s cracked up to be.

That’s why turning to experienced Cloud experts like those at GDT can help make your cloud dreams a reality. They hold the highest cloud certifications in the industry and are experienced at delivering solutions from GDT’s key cloud partners―AWS, Microsoft Azure and Google Cloud. They can be reached at CloudTeam@gdt.com. They’d love to hear from you.


If you’d like to learn more about the cloud, migrating to it, considerations prior to a migration, or a host of other cloud-related topics, you can find them here:

Are you Cloud Ready?

Calculating the costs–soft and hard–of a cloud migration

Migrating to the Cloud? Consider the following

And learn how GDT’s Cloud Team helped a utility company achieve what they’d wanted for a long, long time:

A utility company reaps the benefits of the cloud…finally

Brazil now, U.S. later?

By Richard Arneson

Hopefully the answer is a resounding “NO”, but the Brazilian banking industry has recently been hit hard by “GhostDNS”, so named by China-based security research firm NetLab, which discovered the sinister malware in September. The phishing infection has hijacked over 100,000 routers in South America’s largest country and harvested customer login information for many of its largest financial services firms. It’s estimated that it had been running undetected since June of this year.

Domain Name Service (DNS) simplifies the lookup of IP addresses associated with a company’s domain name. Users can remember GDT.com, but servers don’t understand our nomenclature. They need an IP address. Without DNS, the Internet, which processes billions of requests at any given moment, would grind to a halt. Imagine having to keep track of all the IP addresses associated with the thousands of websites you visit, then typing them into a browser.
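In code, that name-to-address translation is a single resolver call. A quick Python illustration (resolving localhost here so the snippet works without a network; a real lookup would go out to your configured DNS servers, which is exactly the setting GhostDNS tampers with):

```python
import socket

def resolve(hostname):
    """Ask the system resolver (and ultimately DNS) for an IPv4 address."""
    return socket.gethostbyname(hostname)

print(resolve("localhost"))  # typically 127.0.0.1, no DNS query required
# resolve("gdt.com") would return whatever your DNS servers answer --
# which is precisely the answer rogue-DNS malware falsifies.
```

Because applications trust whatever address comes back, an attacker who controls the resolver controls where your traffic goes.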

Here’s how GhostDNS works

GhostDNS is spread through remote access vulnerabilities and can run on over seventy (70) different types of routers. NetLab identified over a hundred (100) different attack scripts that were deployed and discovered them running on several high-profile cloud hosting providers, including Amazon, Google and Oracle.

The attack scripts hijacked organizations’ router settings, which resulted in their traffic being sent to an alternative DNS service. This re-directed traffic headed to rogue, or phony, sites designed to mimic the landing pages of Brazil’s major banks (some telecom companies, ISPs and media outlets were targeted, as well). Users believed they were on the real landing pages, then happily typed in their usernames and passwords.

While GhostDNS malware has primarily affected routers in Brazil, which is one (1) of the top three (3) countries affected by botnet infections (India and China rank 1 and 2, respectively), the FBI is working to ensure it hasn’t spread to the United States. If you believe your organization may have been infected by GhostDNS, the FBI has provided an easy way to check online here. Just type your DNS information into the search box. It’s that simple.

A four-pronged module approach to evil

  1. A DNSChanger module attacks routers that, based on collected information, are deemed target-worthy due to weak or unchanged login credentials or passwords.
  2. A Web Admin module provides a portal, of sorts, where attackers can access the phony login page.
  3. A Rogue DNS module resolves the domain names to which users believe they’re heading. Again, most of these domain names are of Brazilian financial institutions.
  4. The Phishing Web module is initiated after the goal of the Rogue DNS module has been satisfied. It then steers the fake DNS server to the end user.
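Given those modules, one coarse sanity check a security team can run is to compare the answers a suspect resolver returns for high-value domains against addresses known to be legitimate. A toy sketch (the domain and every address below are invented for illustration):

```python
# Addresses the real bank is known to serve from (entirely hypothetical).
KNOWN_GOOD = {"bank.example.br": {"203.0.113.10", "203.0.113.11"}}

def looks_hijacked(domain, resolved_ip):
    """Flag a lookup whose answer isn't in the known-good set --
    the signature of a rogue-DNS redirect."""
    good = KNOWN_GOOD.get(domain)
    return good is not None and resolved_ip not in good

print(looks_hijacked("bank.example.br", "203.0.113.10"))   # False: legitimate
print(looks_hijacked("bank.example.br", "198.51.100.99"))  # True: suspicious
```

Real detection is harder, since legitimate sites change addresses and use CDNs, but a mismatch like this is a strong signal worth investigating.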

As a result of NetLab’s detective work, the further spreading of GhostDNS appears to have been reined in. Networks have been shut down so remediation and enhanced security measures can be implemented. But rest assured, something as big, or bigger, will soon take its place.

IT Security questions? Turn to the Experts

GDT is a 22-year-old network and systems integrator that employs some of the most talented and tenured security analysts, solutions architects and engineers in the industry. They design, build and deploy a wide array of solutions, including managed security services and professional services. They manage GDT’s 24x7x365 Network Operations Center (NOC) and Security Operations Center (SOC) and oversee the networks and network security for some of the most notable enterprises, service providers and government agencies in the world. You can contact them at NocASALL@GDT.com. They’d love to hear from you.

A Robust Solution for the Entry-Level storage customer

By Richard Arneson

If your backyard is the size of a Greenwich Village apartment, you probably wouldn’t buy a tractor with a mulching attachment to mow the lawn. The same holds true for technology solutions. Why should only the biggest of the biggies get to enjoy best-of-breed, cutting-edge technology solutions? And why should they have to pay higher prices, only to be told that economies of scale prevent them from enjoying more aggressive pricing? Well, based on the recent introduction of its next-generation PowerVault ME4 Series family of storage arrays, Dell EMC’s answer is obvious―they shouldn’t.

Small- and medium-sized businesses (SMBs) not only compose the vast majority of businesses in the United States, but they account for well over fifty percent (50%) of all sales. Those figures aren’t lost on Dell EMC; the company obviously understands the importance of providing solutions to the businesses that fill the SMB space. Its PowerVault ME4 storage arrays allow SMBs to purchase storage arrays that perfectly fit their needs and come at a budget-friendly, easily digestible price point.

A storage solution that meets the demands unique to SMBs

Workloads are every bit as important to small- and medium-sized businesses as they are to enterprises. Theirs might come in slightly different flavors, however, and can include everything from databases and disk backups to applications needing a solid SAN/DAS solution and virtual desktop infrastructure (VDI). And smaller companies with smaller IT staffs are met with the same expectations as their enterprise counterparts; namely, managing diverse sets of IT infrastructure solutions.

The initial phase of Dell EMC’s goal to deliver a simplified storage portfolio was accomplished in early 2018, when it introduced PowerMax, its enterprise-class solution.

Not their first SMB Storage Solution rodeo

The reason behind Dell EMC’s introduction of PowerMax and PowerVault ME4 can be boiled down to one (1) word―simplification. But that’s not to say it hasn’t already been delivering great storage solutions for the SMB market. IDC’s Q2 2018 Worldwide Quarterly Enterprise Storage Tracker listed Dell EMC as the leader in the entry storage market; in fact, it holds a thirty-one percent (31%) revenue share in the segment. With the introduction of PowerMax and PowerVault ME4, that percentage will soon get larger.

More features, three (3) great options to choose from

The PowerVault ME4 solutions portfolio, while certainly delivering simplicity, features a number of improvements over its predecessors, including larger capacity, faster performance and all-inclusive software.

Dell EMC’s PowerVault ME4 solution comes in three (3) different flavors to accommodate the precise needs of the SMB market. The ME4012 features twelve (12) drives in a 2U (3.5” high) profile, and the ME4024, also 2U, comes with twenty-four (24) drives. The big dog, the ME4084, is a 5U (8.75” high) array with eighty-four (84) drives. Their starting prices are staggeringly low and can comfortably fit into any IT budget.

The PowerVault ME4 solutions are highly optimized and purpose-built for SAN and DAS environments, can be configured from 0–100% flash, are expandable up to 4PB, and can drive up to 320K IOPS. And, as previously mentioned, all include the software you’ll need to manage, store and protect your data. Whether connecting to a high-availability SAN environment or integrating with a Dell EMC PowerEdge Server, simplification is the operative word. And the arrays can be quickly configured with a new, intuitive HTML5 web-based interface, so management can be conducted anywhere, at any time.

If the primary word is simplification, protection isn’t far behind

With PowerVault ME4 arrays, RTOs (recovery time objectives) and RPOs (recovery point objectives) can be addressed and met through snapshots, IP replication and asynchronous multi-site FC capabilities. The result? Data protection and robust disaster recovery options.
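As a rough illustration of how those objectives interact, here’s a hedged sketch; the function below is not a Dell EMC tool, just simple arithmetic for sanity-checking whether a snapshot schedule plus replication lag can satisfy a target RPO.

```python
def meets_rpo(snapshot_interval_min, replication_lag_min, rpo_min):
    """Worst-case data loss is roughly one full snapshot interval plus
    the replication lag; the RPO is met if that stays at or under target."""
    return snapshot_interval_min + replication_lag_min <= rpo_min

# Hourly snapshots with a 10-minute replication lag vs. a 2-hour RPO
print(meets_rpo(60, 10, 120))   # True: worst case is 70 minutes
# The same schedule can't satisfy a 30-minute RPO
print(meets_rpo(60, 10, 30))    # False
```

The same arithmetic, run in reverse, tells you how aggressive a snapshot schedule must be to hit a given RPO.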

Need more info about Dell EMC storage solutions?

Turn to the storage experts at GDT. For the past twenty-two (22) years, GDT has been a leading network and systems integrator by partnering with industry leaders, such as Dell EMC, HPE, Cisco, VMware, Pure Storage, Citrix, F5 Networks, and dozens of others. Our tenured, talented solutions architects and engineers deliver customized, cutting-edge client solutions with best-of-breed technologies that lead customers on their digital transformation journey. For more information about your storage solutions options―whether you’re in the SMB or enterprise market―contact GDT’s solutions architects or engineers at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

For more information about storage solutions, read: Flash, yes, but is it storage or memory?

Don’t put off ‘til tomorrow what you must do today

By Richard Arneson

Disaster Recovery planning is like insurance―you know you need it, but there’s nothing fun about it. And that’s before you’ve even paid a premium. It’s easy to file it into one (1) of two (2) categories: I’ll get around to it or It’ll never happen to us. And like insurance, taking either philosophy could leave behind a wide swath of damage from which total recovery may be impossible.

Actually, there’s a third reason disaster recovery planning is often the victim of procrastination―it’s not easy. In fact, it can be very complicated if done, well, properly. But it’s needed; not later, but now.

The following are ideas to consider before sitting down to take that first stab at creating a plan. There’s no question each of the five (5) points will spawn a myriad of additional things to consider, but they’ll get you headed in the right direction.

Create a DR Team

Developing a Disaster Recovery Team that meets regularly, even after the plan has been crafted and tested, will help create a more collaborative, open attitude toward disaster planning. Incorporate a wide range of skill sets within the team, and give each member a well-defined role. In addition, each should have a backup role; for instance, somebody whose primary responsibility is applications might have a secondary role working with the telecom department.

Inventory your Assets

An IT inventory must be conducted to include all applications, network assets, applicable personnel and vendors. Create a list of everything that will be needed to recover from a disaster. Include network diagrams and any recovery sites, and ensure all equipment, including cables and wiring, is labeled. It might sound elementary, but if it’s not done, tracing cabling back to devices will take time and create unnecessary costs and headaches.

Once you’ve inventoried personnel and vendors, create a call list that―regarding personnel―details their responsibilities and backup assignments. Assign the management of the call list to one (1) person to avoid any blame games. And make sure they’re held accountable for updating it regularly.

Document the Plan

Once inventories have been conducted and verified for accuracy, include any pertinent information, such as software licenses and asset lifecycles. And while it hopefully won’t be needed, include information about applicable insurance, including policy numbers. If you’ve designated a recovery site, include information and maps about how to get there. Don’t leave out something because you assume it’s widely known. If you’re going to assume anything, assume that whoever refers to the plan knows nothing. You won’t offend anybody by including information that seems rudimentary or unnecessary. What will be offensive is if personnel refer to the plan and it’s unclear.

Now Test it…and test it…and test it

Prior to testing your plan, which should be conducted at least once a year, script it out, then rehearse it with key personnel. If you’re concerned that testing the entire plan will pull employees off projects for extended periods of time, test subsets, or smaller chunks, of it. But like anything, the more you rehearse the better you get. You can throw in some curveballs and see how the backup planning works. Pretend certain staff members are on vacation; see if their backup is ready to enter the game and make a difference. Or test it with personnel who have had nothing to do with its creation. Get creative, pretend you’re a football coach. Throw a variety of issues at your plan and personnel and see how well it stands up. See if your documentation is easy to follow and covers all the bases.

Get Executive Buy-In

Make sure executives understand the importance of a DR plan, and why creating and testing it on a regular basis will mean pulling personnel off projects or initiatives from time to time. Ensure they understand that creating a DR plan will encompass all departments and key stakeholders from each, and that the plan isn’t static―it needs to be re-evaluated, edited and tested on a regular basis.

Need more info about creating a DR Plan?

Turn to the experts. For the past twenty-two (22) years, GDT has been a leading network and systems integrator by partnering with industry leaders, such as HPE, Cisco, Dell EMC, VMware, Pure Storage, Citrix and F5 Networks. Our tenured, talented solutions architects and engineers deliver customized, cutting-edge client solutions with best-of-breed technologies that lead customers on their digital transformation journey. For more information about creating a DR plan for your organization, contact GDT’s solutions architects or engineers at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

And if you’d like to learn more about DR plans, you can read about them here:

DR in the Cloud

How do you secure a Cloud?

Want to read about a cool, real-world blockchain application? Oh, and it’s not Bitcoin

By Richard Arneson

With a value of almost $300 billion, retail juggernaut Walmart, which includes Sam’s Club, has turned to blockchain to keep customers safe from future produce-related illnesses. It’s estimated that outbreaks of foodborne illnesses, like those that occurred in April due to E. coli-tainted romaine lettuce, result in combined costs of over one hundred fifty billion dollars ($150bn) due to medical care, sick days taken from work and discarded food.

Walmart announced that by January of 2020 all their California-based produce suppliers―Dole, Fresh Express and Taylor Farms, to name a few―will be required to join its blockchain-based supply chain, which they’ve been working on and testing for the past two (2) years. They’re confident the technology will make it far easier to trace the source of any produce containing dangerous bacterial strains, such as E. coli, listeria, salmonella and campylobacter. And Walmart isn’t just turning to blockchain to keep customers safe; they estimate that implementing it in their supply chain will save them considerable time and millions of dollars they lose due to recalls.

They’re focusing more specifically on produce suppliers from California’s Salinas Valley region, which is where the vast majority of April’s tainted romaine lettuce was grown. It was reported that the lettuce killed five (5) people and hospitalized over two hundred (200) nationwide. Consumers were advised to refrain from purchasing romaine lettuce grown in California or Arizona, whose Yuma region was reported to be another, albeit smaller, source of tainted lettuce. This advisory, while necessary, proved somewhat ineffective, as consumers found it hard to determine where their purchased lettuce had been grown. The Yuma-grown lettuce resulted in only a single incident, which occurred at an Alaska correctional facility.

Here’s how Walmart’s blockchain-based supply chain works

Dole, which is Walmart’s largest supplier of produce, has participated in their blockchain trial for almost two (2) years. Blockchain’s distributed ledger concept serves as a decentralized accounting system that will be accessible, once it’s widely deployed, by all of Walmart’s produce suppliers.

The initial block of the chain originates from the grower, after which packers and shippers enter their pertinent information on the next block. At that point Walmart receives the chain, which is entered into its distribution system. All parties involved will be able to see the entire ledger, meaning all blocks attached to the chain. While Walmart’s vendors won’t know which company entered information, they’ll be able to see that a particular element within the supply chain was completed.
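That chaining idea can be sketched in a few lines. This is illustrative Python with hypothetical party names, not Walmart’s actual implementation: each block records its payload plus the hash of the block before it, so tampering with any earlier block breaks every later hash.

```python
import hashlib, json

def make_block(data, prev_hash):
    """Each block stores its payload plus the hash of the previous block,
    which is what links the chain together."""
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

# Hypothetical supply-chain entries: grower -> shipper -> retailer
genesis = make_block({"party": "grower", "lot": "A1"}, prev_hash="0" * 64)
shipped = make_block({"party": "shipper", "lot": "A1"}, genesis["hash"])
received = make_block({"party": "retailer", "lot": "A1"}, shipped["hash"])

# Verifying the chain: recompute each hash and compare to the stored one
for block in (genesis, shipped, received):
    payload = json.dumps({"data": block["data"], "prev": block["prev"]},
                         sort_keys=True)
    assert hashlib.sha256(payload.encode()).hexdigest() == block["hash"]
```

Anyone holding the ledger can rerun that verification loop, which is what lets all parties trust entries they didn’t write themselves.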

Here’s what blockchain will provide Walmart and its suppliers

These recent outbreaks of E. coli have been the worst produce-related outbreaks in history. Current supply chain methods, which are widely used in the grocery industry, meant tracking the source of the tainted romaine lettuce took a considerable amount of time. And while the FDA (Food and Drug Administration) and CDC (Centers for Disease Control and Prevention) were busy trying to track down the source of the afflicted lettuce―which took weeks―it was being distributed and consumed far and wide.

While safety certainly takes precedence, Walmart will enjoy other benefits as well, such as faster payments to its produce vendors. Blockchain will also assist in determining which products have the longest shelf life. And consumers, who are more in tune with what they’re putting into their bodies these days, will be able to access far more information about the food they’re eating and feeding to their families.

Presently the FDA requires that companies within a grocer’s supply chain maintain information only on whoever lies one step before and one step after them. This lack of intelligence makes it hard for the FDA and CDC to trace the source of bacterial strains. Add to that the fact that, on average, there are almost a thousand foodborne illness outbreaks each year (a number that will probably be eclipsed in 2018).

Questions about which technologies can help you meet your digital transformation goals?

The talented, tenured technologists at GDT can provide the answers. They’ve implemented cutting-edge solutions for customers of all sizes and from a variety of industries. You can reach them at Engineering@GDT.com or SolutionsArchitects@GDT.com. They’d love to hear from you!

And check these out…

You can get a little more educated about blockchain here. And click here to watch a great Lunch n’ Learn video presentation on blockchain conducted by GDT Network Engineer Ryan Rogers.

Gen V―very important, but probably not what you think it means

By Richard Arneson

Gen X, Gen Y, NextGen, 5G, 4G…if you could buy stock in the number of ways Gen has been used, I’d be the first to reach for my checkbook. Here’s another one, and it may be the most important―Gen V.

Gen V is what Check Point, a 25-year-old leading provider of cyber security solutions, dubbed the latest generation of cyber threats. Just yesterday, in fact, there were over 12 million attacks…and that wasn’t even an outrageously intense day in the cyber threat world. You can get more details at ThreatCloud, Check Point’s worldwide threat map. It’s fascinating, but bone-chillingly scary when you see the threat totals at the upper left.

To fully understand Gen V, you might find it useful to learn about Gens I through IV

Gen I

Remember how awesome it was to have your own PC, only to have that excitement spoiled, at least somewhat, once you learned about hackers? The bad guys launched viruses, and the nascent IT security industry returned serve with anti-virus products. Simple enough.

Gen II

Once the Internet became as much a part of our lives as central heat and air, hackers followed suit. The Internet allowed them to communicate better, collect information more easily, and raise the stakes to benefit financially. Gen II allowed maliciousness to reach a much broader audience by ushering in software that could be launched corporate-wide; a single infected PC could result in widespread, crippling infections. Security vendors responded with intrusion detection systems (IDS) and firewalls.

Gen III

Not surprisingly, attackers eventually found a way to breach those firewalls and intrusion detection systems, and did so, in part, by becoming experts at analyzing victims’ software and networks. This led the IT industry to determine that a more active, less reactive, approach to security was needed. For instance, Check Point began to focus on better preventative measures and launched its IPS (intrusion prevention system) products.

Gen IV

With Gen IV, threats became more sophisticated and resulted in everything from breaches that exposed personal information to national security threats, including―gulp!―international espionage. Gens II and III resulted in better inspection of traffic but failed to inspect and validate content that could be included in emails and downloads. Check Point responded with sandboxing and anti-bot products that addressed this new level of maliciousness, including zero-day attacks, which exploit flaws organizations didn’t even know existed. They’re called zero-day attacks because they can be exploited immediately, leaving victims zero time to create and load the patches needed to address the vulnerabilities.

Gen V Attacks―when the bad guys bring out the big guns

If Gen I through IV attacks are guns, tanks and rocket-propelled grenades, Gen V represents bombs of the atomic or nuclear variety. Wide-scale infection and destruction ensue from Gen V attacks, as blistering, multi-vector attacks are covertly leaked and launched. The resultant casualties can number into the millions, as prior Gen tools and product sets prove no match for this new, heightened level of digital evil. Check Point determined that a more integrated and unified approach to security was needed. It developed a unified architecture with an even higher level of advanced threat protection, including the sharing of real-time threat intelligence. Its Gen V security solutions address customers’ mobile devices, their use of the cloud, remote offices, even virtual endpoints.

The Security Check-Up

GDT’s July 17th blog entitled Rx for IT Departments: a Security Check-Up addresses the importance of conducting a security check-up for your organization. To dovetail with that, Check Point provides an online security tool called CheckMe. It runs simulations to test the security of your network and its endpoints, including your organization’s use of the cloud and mobile devices. And it comes at the perfect price―free!

Call on the security solutions experts

GDT’s tenured and highly certified security professionals are experienced at implementing managed security solutions, including those from premier partner Check Point. After years of working closely with Check Point, it comes as no surprise that the company has been recognized for the seventh (7th) year in a row as a leader in Gartner’s annual Magic Quadrant for Unified Threat Management (UTM). For more IT security information, contact GDT’s security professionals at SOC@GDT.com. They’d love to hear from you.

Why should I care about 5G?

By Richard Arneson

Like the G’s that have preceded it, 5G has gotten a lot of press and publicity for what seems like years. In the IT industry, however, months can feel like years. Eager technophiles are anticipating the day when they can use―and then proudly broadcast to the world that they’re using―whatever technology we’ve been hearing about for months and months. Welcome to the current hottest of topics―5G.

But first, a quick walk down the memory lane of mobile phones

1G

1G was the first generation of wireless communications. Actually, using the word communications is a little misleading; it suggests there was more than one (1) type of communication, and 1G delivered only voice. Think back to the 1980s, when cell phones first became available. It felt like only the top 1% of wage earners had one. The phones were heavy and comparable in size to a World War II field phone; they couldn’t fit in your pocket and could only be stuffed into a briefcase with expanding sides. They weren’t even digital, but analog, and the battery seemed to always last less time than the call you were on.

2G

Introduced in the early 1990s by the Finnish, 2G provided something so cutting-edge at the time that people used its key feature to transmit things like “Hi”, “Hello”, “Are you getting this?” and “Do you believe this actually works?” Yes, it marked the advent of text messaging, also known as SMS (short message service). Its next evolution, MMS (multimedia messaging service), allowed pictures, audio and video to be attached to text messages and transmitted. The max speed went from 1G’s 2.4 Kbps to 50 Kbps. Incomprehensible…at the time.

3G

Not to give short shrift to 1G and 2G, but 3G, which was introduced to the marketplace in the early 2000s, was arguably the first “next generation” with which the general public really began to take notice. And why not, when speeds shot up to 2 Mbps and the words mobile and broadband were linked together for the first time. Users began to use their phones to access the Internet and stream content. It also accompanied something some at the time (OK, I was one of them) considered a little crazy, something that would rarely, if ever, be used on a cell phone―the camera.

4G

In 2008, 4G, our current standard, helped usher in the smartphone. It delivered speeds up to 100 Mbps, which was needed as consumers began using smartphones for gaming, HDTV and videoconferencing…all those applications that demand crazy-high-speed data transmission. Remember Apple’s 2007 introduction of the iPhone, with its “Hello” advertising campaign that first aired during the Academy Awards broadcast?

Hello 5G

While not quite here, 5G is just around the corner. Here’s what it will mean to consumers:

Faster speeds

5G touts the delivery and downloading of data much, much faster, a feature that shouldn’t come as a surprise to anybody. A new generation of wireless without faster speeds would be like a new music technology that doesn’t profess clearer, more dynamic sound. Speeds for 5G are supposed to be over ten times (10x) those of 4G, or around 1 Gbps.
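Some quick, idealized arithmetic shows what that difference means for, say, a 5 GB movie (an assumed file size; real-world throughput is always lower than the advertised peak):

```python
def download_seconds(size_gigabytes, link_mbps):
    """Idealized transfer time: size in gigabytes over a link in megabits/s
    (1 GB = 8,000 megabits; overhead and congestion ignored)."""
    return size_gigabytes * 8000 / link_mbps

movie_gb = 5  # an assumed HD movie size
print(round(download_seconds(movie_gb, 100)))   # 4G at 100 Mbps: 400 s
print(round(download_seconds(movie_gb, 1000)))  # 5G at 1 Gbps: 40 s
```

A nearly seven-minute wait drops to well under a minute, which is the kind of difference consumers actually notice.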

Latency

Latency, or the time it takes to move data from device to device, will be greatly reduced with the introduction of 5G. While 4G might be fitting the bill for your current needs, lower latency will prove critically beneficial, even lifesaving, for certain applications, such as surgery or the need for real-time data delivery to and from connected cars.

IoT

Faster speeds, lower latency…both can be chalked up to the needs of the IoT world. In the next four (4) years, the number of IoT devices in use today (17 billion) will double, and with that precipitous growth comes the need for more cells to pick up and transmit the data. With 5G, smaller amounts of data will be transmitted at lower frequencies, while larger, bandwidth-hogging amounts will travel at higher ones. These multiple frequencies will require service providers to deploy smaller, but densely packed, cells on existing towers. These cells will determine the type of data, and the resultant frequency, that needs to be transmitted.

But before you get your credit card out…

It’s estimated that 5G won’t be fully deployed until 2022. Remember, service providers don’t, and can’t, roll out a next-generation wireless technology all at once. They have a lot of cell towers to upgrade, so it’s implemented in stages. But all of the major carriers will begin 5G implementation in selected markets by the end of 2018―yes, that’s this year and only three (3) short months away.

Mobility and IoT Experts

If you’d like more information or have questions about what 5G can and will mean to your organization, contact the talented, tenured solutions architects and engineers from the IoT and Mobility Solutions practices at GDT. They can be reached at Mobility_Team@gdt.com. They’d love to hear from you.

For more about Mobility and IoT…

Click here to get more information about mobility solutions, and here to watch a video about how GDT delivered a secure mobility solution to a large retailer.

Busting myths about fiber optics

By Richard Arneson

How often do you and your buddies sit around and talk about fiber optics? That little, huh. It would be a bit like chewing the fat about your home’s electrical wiring. Sure, it could happen, but conversations related to politics, sports, religion, et al. will probably trump wiring every time. Fiber optics is a lot like electricity―it’s been around a long time, is reliable, and we only talk about it when it doesn’t work. Oh, and life without it just may prove unlivable. For instance, if you’re thinking you’ll use your smart phone to hop on the Internet or make a phone call, it won’t be possible without fiber optics. While you don’t see fiber strands dangling from your smart phone, there’s a little thing called wireless backhaul. After your wireless voice or data hits the nearest cell tower, those 1’s and 0’s are carried back to the service provider’s network via…fiber optics. And that’s just a small example of why fiber optics, whether you realize it or not, is as critical to our way of life as electricity.

So in the event you hear any of the following disparaging remarks about fiber optics, rest assured they’re all myths.

Myth 1—Fiber optics is glass…of course it’s fragile

Just the word fiber should be enough to debunk this myth. Think about fiberglass and its many durability-demanding uses; it’s composed of glass fibers and wraps the car you drive. Fiber optics, when compared to its copper counterpart, is considerably more durable. While tugging on it isn’t recommended, its pull tension is much stronger than that of copper or coax. And it’s far better equipped to handle the wide array of environmental conditions thrown at it. Consider water, for instance. Copper carries signals electrically―not good when mixed with water. Fiber optics has no such problem; it carries signals with a beam of light. Try this one on for size: the fiber optic cable used outdoors has a 600- to 800-pound tension rating. Not to suggest that you can swing on it, but it’s super strong. Busted.

Myth 2—Fiber optics is very pricey

This myth was once true, at least partially, but at present installing fiber optics is comparable in cost to installing copper or coax. Its price has steadily decreased due, in part, to advances in signal termination technology, which have made terminations cheaper and more efficient. Also, less equipment is needed for fiber networks, and, because fiber doesn’t carry electricity, it can even lower your utility bills. Busted.

Myth 3—Fiber optics installations are difficult

Like the price myth, this one was at one time factual. But that fact died sometime in the mid-1990s. For years fiber optics has been the standard of choice for service provider backbones. If field operations personnel aren’t comfortable working on and installing fiber optics by now, their skill sets are about twenty (20) years behind the times. And due to fiber optics’ lack of an electrical current, there are fewer routing restrictions and no need to worry about electromagnetic interference (EMI) or radio frequency interference (RFI). Busted.

Myth 4—Bend it and you’re cooked

There was a time when fiber optics was more sensitive to bending, but the myth has never been entirely true. Yes, it was once a little less bend-friendly, but now bend-insensitive fiber is used in the event a super-tight radius is required. This is just one of the many reasons fiber optics is so amazing. Bend-insensitive fiber has a trench that surrounds the fiber core but sits inside the cladding encasing it. This tiny trench has a lower refractive index, so any light that starts to escape the fiber on a tight radius is reflected back into the core. If you could bend a mirrored tube around, say, a telephone pole and shine a flashlight in one end, light would exit the other, right? That’s very similar to how bend-insensitive fiber works. Busted.

As a side note, bend-insensitive fiber is used solely indoors; outdoor applications will never require that tight a turn radius. If one does, its layout has been poorly planned.

Now for some quick FACTS about fiber optics

It’s super-fast (light in fiber travels at roughly two-thirds the speed of light in a vacuum), has far less attenuation (signal loss) than copper or coax, is impervious to EMI and RFI, doesn’t pose a fire hazard, and doesn’t require replacement nearly as often as coax or copper. Those are some of the many reasons why fiber optics will be around, and continue to be vital to our lives, for a long time to come.
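The attenuation advantage is easy to sketch with a simple link budget. The numbers below are illustrative assumptions (single-mode fiber typically loses roughly 0.2–0.35 dB/km), not a substitute for a real optical design:

```python
def received_power_dbm(tx_power_dbm, loss_db_per_km, distance_km):
    """Simplest optical link budget: transmit power minus per-kilometer
    attenuation (connector and splice losses ignored)."""
    return tx_power_dbm - loss_db_per_km * distance_km

# Assume a 0 dBm transmitter and 0.25 dB/km fiber over an 80 km span
print(received_power_dbm(0, 0.25, 80))  # -20.0 dBm, still workable
```

Copper and coax, by contrast, can lose tens of dB per kilometer, which is why they need repeaters or amplifiers far more often.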

For questions, turn to these optical networking experts

If you have questions or would like more information about fiber optics or optical networking, contact GDT’s Optical Networking practice professionals at Optical@gdt.com. Composed of experienced optical engineers, solutions architects and project managers who specialize in optical networks, the GDT Optical Networking team supports some of the largest service providers and enterprises in the world. They’d love to hear from you.

For additional reading material about fiber optics, check these out: A fiber optic first and When good fiber goes bad.

Blockchain―it’s more than just Bitcoin

By Richard Arneson

Think back to that Accounting 101 class you took in college. As an English major, I found the class to be miles on the other side of difficult; I thought I’d accidentally signed up for the CPA prep course. But the first thing you learn is (let’s all say it together) Debits to the left, credits to the right. What you add or subtract from one side, you do the opposite to the other. Add a credit, and subtract that amount from debits, and vice versa. And with that, a ledger has just been described, which is exactly what Blockchain is. Blockchain’s first and most widely publicized application is Bitcoin, a cryptocurrency that makes a ledger available to anyone, whether they’re involved in a transaction or not. However, this public ledger doesn’t disclose the parties involved in any of the transactions.

Blockchain’s conceptual roots date to the early 1990s; it is a comprehensive listing of records linked, or chained, together. Bitcoin runs on the Blockchain platform and blends together the worlds of technology and finance. It was created in 2008 by Satoshi Nakamoto, a pseudonym for either a person or a group of people―nobody’s quite sure which. Bitcoin has been one of the most talked-about topics in years for two (2) reasons:

  1. Tremendous gains for investors trading (as in buy low, sell high) in Bitcoin have been widely reported, even though news about the hefty transaction fees charged by Bitcoin exchanges has been reserved for the back page.
  2. Bitcoin has been the primary currency demanded by those who launch ransomware, due to the erroneous belief that Bitcoin transactions are untraceable. They can be traced, but I’ll save that for a future blog.

In Blockchain, including Bitcoin, of course, those spaces in which you enter debits or credits are called―appropriately―blocks. And those blocks are chained together (does the name make sense now?). Each block contains a user’s unique identifier and information about both the transaction and the block that precedes it. Each block further strengthens the chain by verifying the previous block; the more blocks, the more times the chain gets verified. And because Bitcoin, like Blockchain, is a distributed ledger and not a centralized database, it can’t be quietly altered after the fact.

How are Bitcoin transactions conducted?

Each Bitcoin user has an account, known as a Bitcoin wallet, in which their Bitcoin balance and information about all their transactions are maintained. If a user needs to send Bitcoin to another user, they publish their intent to do so, after which Bitcoin nodes receive the information and verify that the sender has enough money in their wallet and hasn’t already sent it to somebody else. Once that’s completed, a block is created that includes the sender’s identifier, information about the transaction, including the recipient’s unique identifier, and the preceding block in the chain.
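The balance check described above can be sketched in a few lines. This is a toy illustration with hypothetical wallets, not how Bitcoin nodes actually work (real verification operates over signed transaction outputs, not account balances):

```python
# Hypothetical wallets for illustration only
wallets = {"alice": 5.0, "bob": 2.0}

def try_transfer(sender, recipient, amount):
    """Reject the transfer unless the sender's wallet covers it,
    mimicking the nodes' sufficient-funds check."""
    if wallets.get(sender, 0.0) < amount:
        return False
    wallets[sender] -= amount
    wallets[recipient] = wallets.get(recipient, 0.0) + amount
    return True

print(try_transfer("alice", "bob", 3.0))  # True: alice has 5.0
print(try_transfer("alice", "bob", 9.0))  # False: insufficient funds
```

Only after a transfer passes this kind of check does it get bundled into a block and chained to the ledger.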

Bitcoin uses Blockchain, but Blockchain is more than just cryptocurrency

Bitcoin and Blockchain are mistakenly used interchangeably. Blockchain is a platform utilized by Bitcoin; in fact, Bitcoin is only one (1) of hundreds of applications that utilize Blockchain. While the word ledger brings to mind numbers, Blockchain can provide a ledger, of sorts, for other things, including contracts, land registries, medical records, music rights for piracy prevention, and many, many more.

Oh, and about those photos…

Blockchain is a fairly straightforward technology, but, in the case of Bitcoin, those stock photographs of shiny gold Bitcoins posted in just about every article you’ve seen on the subject have only added to the confusion. Remember, Bitcoin is virtual currency and utilizes cryptography to secure and verify transactions. No, there are no physical, tangible Bitcoins. You can’t stuff them in your pocket, lose them between sofa cushions or find them at the bottom of your clothes dryer.

If you’re still in need of a visual, this might help―Click Here to watch real, live Bitcoin transactions. Whether you consider this fun is up for debate, but if you want the same info in a more appealing, albeit less detailed, format, you can find it at bitbonkers.com.

Questions? Turn to the Experts

GDT is a 22-year-old network and systems integrator that employs some of the most talented and tenured solutions architects and engineers in the industry. They design, build and deploy a wide array of solutions, including managed services, managed security services and professional services. They operate out of GDT’s 24x7x365 Network Operations Center (NOC) and Security Operations Center (SOC) and oversee the networks and network security for some of the most notable enterprises, service providers and government agencies in the world. You can contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

And for more great information about Blockchain and Bitcoin, watch GDT Network Engineer Ryan Rogers conduct a great Lunch ‘n’ Learn presentation on both here.

The European Union and cookies…not exactly a love story

By Richard Arneson

To detail in a book the benefits that the digital age has delivered over the past twenty (20) years would make Moby Dick look like a brochure. A much, much smaller book would list any negative ramifications, most of which would fall under the label Security. Here’s a third book: Annoyances. Sure, they’re far outweighed by the benefits, but they’ve afflicted everybody who’s turned on a computer, smartphone or tablet to access the Internet.

For years it was buffering, which left the user waiting and waiting―then grabbing coffee while waiting―as the small hourglass or spinning circle ostensibly meant your request was being processed. And how about slow dial-up Internet connections, those noisy, awkward network handoffs, and pop-ups, which are electronically akin to billboards randomly popping up in front of your car and bringing it to a grinding, screeching halt. Now a new one is making the digital scene en masse: the Cookie Consent Banner, brought to you by the European Union (EU).

Cookies, those of the electronic variety, have been around for years and for the most part went unnoticed. You’d set up your browser to accept them, reject them, or confirm their download before proceeding, but once that decision had been established in the browser settings, they didn’t provide much of a speed bump in the road. Cookies are small files that are essentially lookup tables holding simple data, such as the user’s name. If accepted, they can be accessed by both the user’s computer and the web server, and they provide a convenient way of carrying data from session to session without having to re-enter the information.
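Python’s standard library makes that round trip easy to see. Here’s a small sketch (the cookie name and value are made up for illustration):

```python
from http.cookies import SimpleCookie

# Server side: a cookie is just a small named value sent in a header
server = SimpleCookie()
server["username"] = "rarneson"       # hypothetical user name
server["username"]["max-age"] = 3600  # keep it for an hour
print(server.output())  # e.g. Set-Cookie: username=rarneson; Max-Age=3600

# Browser side: the value comes back on later requests, so the site
# can restore state without asking the user again
browser = SimpleCookie("username=rarneson")
print(browser["username"].value)  # rarneson
```

The consent banners at issue simply ask permission before that Set-Cookie header ever reaches your browser.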

In the past couple of months, however, the subject of cookies has been revitalized. Click on certain websites and you’re suddenly face-to-face with a pop-up banner that alerts you to the fact that the site utilizes cookies. Yep, a speed bump.

Why is the cookie consent banner showing up all of a sudden?

The European Union, established in 1993, was an attempt to buoy the competitiveness of its twenty-eight (28) member countries. It eliminates trade and monetary borders between EU countries, making for an easier flow of goods and services. And, yes, it established the euro, which is, behind the U.S. dollar, the most commonly held currency in the world. But in 2002, the EU took on another pet project―cookies. It determined that Internet users’ privacy wasn’t being adequately protected and that cookie usage wasn’t being disclosed. Hence came the EU’s Cookie Law, officially known as the 2002 ePrivacy Directive (ePD). The Cookie Law, or ePD, was not really a law, but a set of goals. It was up to each of the EU members to draft and enforce their own legislation based on those goals―most didn’t. Enforcement was minimal at best. See toothless.

In 2011, the EU enacted the ePrivacy Regulation (ePR), which, as its name suggests, actually is legislation that can be enforced EU-wide. The ePR incorporated other elements as well, such as marketing efforts related to email, faxes, texts and phone calls. Unless you were directly affected by it, the ePR flew well under the radar―that is, until 2017, when the EU updated the ePR and selected May 2018 as its launch date to coincide with that of the General Data Protection Regulation (GDPR). While the GDPR is not technically a subset of the ePR, the two overlap; the GDPR focuses solely on users’ personal data, while the ePR is broader in scope and protects the integrity and confidentiality of communications and data even if they’re not of a personal nature.

The good news? EU regulators have already stated that in 2019 they’re going to introduce simplified cookie rules and make cookie consent a more user-friendly experience. Simplified cookie rules? More user-friendly cookie consent? Yes, it sounds like the EU considers the cookie consent banner an annoyance, as well.

Questions? Turn to the Experts

GDT is a 22-year-old network and systems integrator that employs some of the most talented and tenured solutions architects and engineers in the industry. They design, build and deploy a wide array of solutions, including managed services, managed security services and professional services. They manage GDT’s 24x7x365 Network Operations Center (NOC) and Security Operations Center (SOC) and oversee the networks and network security for some of the most notable enterprises, service providers and government agencies in the world. You can contact them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

When being disruptive is a good thing

By Richard Arneson

The Innovator’s Dilemma is a fascinating book written in 1997 by Clayton Christensen, a Harvard professor who coined the term disruptive technology. He considered it one (1) of two (2) technological categories, the other being sustaining technology. Christensen defined disruptive technologies as those so new and cutting-edge that they hadn’t yet been fully developed and thoroughly tested. As a result, he insisted, they might not be ready for prime time. Disruptive technologies create a lot of buzz and are rife with exciting possibilities, but aren’t viewed as being as “safe” as their sustaining counterparts. Imagine those days when consumers first heard about game-changers like radio and television. They were highly disruptive and littered with issues.

Sustaining technologies, Christensen writes, are conversely those already being utilized that have produced measurable, sustainable results. If you’ve worked in telecommunications (especially in sales), you know the decades-old axiom―nobody ever got fired for using AT&T. In other words, AT&T has been around the longest, has been used the most, and is considered the safest choice. If a CIO questions why you selected AT&T to carry your voice and data traffic, it’s an easily defensible decision. Getting a Fortune 100 company to dive into the world of disruptive technologies may prove difficult. They’ll be far less inclined to utilize something that promises, but hasn’t yet produced, quantifiable results. Once it has, the floodgates will soon open. Oh, and by that time it will have become a sustaining technology.

The smartphone might be the most disruptive of technologies since the introduction of the telephone at the turn of the (last) century. Telephones disrupted several industries, putting a dent in paper manufacturing and the U.S. Postal Service. Now consider the smartphone. It has devastated an array of industries, including photography, publishing, music, GPS devices, even calculators.

The current biggies in the Disruptive Technologies category

Artificial Intelligence (AI)

AI, while being highly disruptive, frightens a lot of people. Whether they’ve been spooked by the fictional, sinister robots of yesteryear, worry about what it may do with its mass of collected data, or fear that it will sound an employment death knell for a variety of industries, one thing is certain: AI’s promotion to a sustaining technology will be here before you know it. Two (2) years ago, academicians and industry experts at the International Conference on Machine Learning predicted that by 2025 AI will outperform human thought. Wow.

Blockchain

Blockchain cryptocurrencies, led by flag bearer Bitcoin, are no longer just a cryptic form of currency exchange preferred primarily by those who launch, and hope to profit from, ransomware. Many large, established banks worldwide are developing cryptocurrencies for selling financial products, such as bonds. Speaking of bonds, the SEC (Securities and Exchange Commission) now has a crypto-bond offering they’re calling Bond on Blockchain. And fund managers are now incorporating cryptocurrency into their portfolio mix.

Li-Fi (Light Fidelity)

In the event you haven’t heard of it, it’s cool and very disruptive. Called Li-Fi, which is short for Light Fidelity, it uses light bulbs, of all things, to replace your home router. Deemed to be at least a hundred times faster than Wi-Fi, Li-Fi utilizes an LED light bulb affixed with a digital processor that sends data with emitted light. Yes, the data is in the light. Let your mind wander for a moment to consider how disruptive Li-Fi could be for any number of industries.

Need more info?

For the past twenty-two (22) years, GDT has been a leading network and systems integrator, partnering with industry leaders such as HPE, Cisco, Dell EMC, VMware, Pure Storage, Citrix and F5 Networks, among many others. Our tenured, talented solutions architects and engineers deliver customized, cutting-edge solutions with best-of-breed technologies that lead customers on their digital transformation journey. For more information about the IT industry’s wide array of technologies, both disruptive and sustaining, you can contact our solutions architects and engineers at either SolutionsArchitects@gdt.com or Engineering@gdt.com. They’d love to hear from you.

Read about the differences between AI, Machine Learning and Deep Learning and Pure Storage’s answer to AI―FlashBlade.

Sexy, yes, but potentially dangerous

By Richard Arneson

Apologies for the headline in the event you’ll soon label it as an act of sensationalism, but the topic of today’s blog needs to be considered, then forwarded, if you or others you know have implemented, or are in the planning stages of implementing, your organization’s IoT strategy. The IT industry is rife with two- to four-lettered initialisms or acronyms―SDN, BYOD, SLAM, SAN, BGT, CRC, IBT… we’ll stop there; this might be a list that is actually never-ending.

Unlike AI (there’s another one), which for some conjures up negative images, IoT (Internet of Things) is rarely the subject of similar scrutiny. IoT is exciting—sexy by IT standards—for several reasons, and one of the biggest is its ability to enable business owners to reach out to customers who might be standing outside their place of business, whether a storefront, bar or restaurant, at that very moment. Yes, when a technology can drive revenue, it’s always going to be a hot topic. But with the good comes bad, at least in the IT industry. And that bad usually falls under the heading Security. IoT, sadly, is no different, and the following represent the greatest present threats to IoT security.

The most prevalent types of security threats that affect IoT

Identity Theft

Identity theft requires one (1) primary element―lots and lots of data. Now consider the number of IoT devices at play in addition to smartphones―doorbells, thermostats, utility meters, watches, et al. They’re all connected to networks, which immediately broadens your attack surface. And all of that personal data can usher in a host of vulnerabilities. If patches or updates aren’t installed, or if, for instance, Alexa is traversing the same network you’re utilizing for Internet connectivity, you’ve created or widened gaps in security.

Con Artistry

Most consider themselves immune to this type of threat, but there have certainly been victims who once believed that very thing. Protecting yourself against con artists sounds like common sense, but a considerable share of IoT threats involve the inadvertent coughing up of sensitive information to those posing as bank employees or customer service representatives of a company you’ve done business with in the past. Usually these cons come in the form of email phishing, and the broad nets perpetrators cast are considerable.

Distributed Denial of Service (DDoS) attacks

When a highway, or any type of thoroughfare, is shut down, you’re denied the service that roadway provides. DDoS attacks are no different. They’re usually launched by a botnet, which floods a network with simultaneous requests from far more sources than the network can accommodate. The thoroughfare comes to a grinding halt, but the goals of DDoS attacks have less to do with data gathering and more to do with inflicting lost revenue and customers, including the sullying of a company’s good reputation that may have taken years to build.
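Defenses vary, but one common mitigation is per-source rate limiting, which can be pictured with a toy token-bucket sketch in Python (the class and the IP address are illustrative, not a production defense):

```python
import time

class TokenBucket:
    """Illustrative per-source rate limiter: each source gets a bucket of
    tokens that refills steadily; a flood of back-to-back requests drains
    the bucket, and the excess is rejected instead of clogging the network."""
    def __init__(self, capacity=5, refill_per_sec=1.0):
        self.capacity = capacity
        self.refill = refill_per_sec
        self.state = {}  # client -> (tokens remaining, time of last request)

    def allow(self, client_ip):
        now = time.monotonic()
        tokens, last = self.state.get(client_ip, (self.capacity, now))
        # Refill the bucket for the time elapsed since the last request
        tokens = min(self.capacity, tokens + (now - last) * self.refill)
        allowed = tokens >= 1
        if allowed:
            tokens -= 1  # each admitted request spends one token
        self.state[client_ip] = (tokens, now)
        return allowed

limiter = TokenBucket(capacity=5)
# A burst of 20 back-to-back requests from one source: only the first
# five (5) get through; the flood is dropped
results = [limiter.allow("203.0.113.9") for _ in range(20)]
print(results.count(True))  # 5
```

Real-world DDoS mitigation happens at much larger scale (upstream scrubbing, anycast, and the like), but the principle is the same: cap what any one source can pour onto the thoroughfare.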

Botnets

The aforementioned botnet is a collection of networked systems that take over a network and spread malware like the flu. The newly installed malware can result in a variety of costly symptoms, including the gathering of personal information and the launching of DDoS and phishing attacks, to name a few. The combined systems make botnets more insidious, as attacks can be launched from a variety of sources.

The Man-in-the-Middle

Remember the game Monkey in the Middle, where player C stands between players A and B and tries to intercept or block their pass? Man-in-the-Middle threats represent player C, which attempts to disrupt communications between users A and B. Here’s the difference: in a Man-in-the-Middle attack, users A and B don’t know there’s a user C in the game. Communications between the two (2) users are not only interrupted, but user C can then mimic user A or B―or both―to gather important and sensitive information. Intrusion detection systems (IDS) are probably the best preventative measure against Man-in-the-Middle attacks and can detect when user C tries to insert itself into the conversation.
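One way the endpoints themselves can catch user C is certificate pinning: user A compares the certificate the far end presents against a fingerprint obtained out-of-band. A minimal Python sketch (the certificate bytes here are stand-ins, not real certificates):

```python
import hashlib

def fingerprint(cert_der: bytes) -> str:
    """Hash the peer's certificate so it can be compared to a known pin."""
    return hashlib.sha256(cert_der).hexdigest()

# Hypothetical pinned fingerprint user A obtained out-of-band, i.e. user A
# already knows what user B's certificate should hash to
pinned = fingerprint(b"--user B's real certificate--")

def verify_peer(presented_cert: bytes) -> bool:
    # If user C swaps in its own certificate mid-conversation, the hash
    # won't match the pin and the connection should be torn down
    return fingerprint(presented_cert) == pinned

print(verify_peer(b"--user B's real certificate--"))    # True
print(verify_peer(b"--user C's forged certificate--"))  # False
```

The same idea, at scale, is what makes user C's impersonation detectable: anything it presents hashes to the wrong value.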

The IoT Industry is growing; unfortunately, so is its Attack Surface

It’s estimated that the number of IoT devices in use worldwide will more than triple in the next seven (7) years, growing rapidly from its current 23 billion to over 75 billion by 2025. The cat-and-mouse game that steadily pits security organizations and experts against cyber attackers will only intensify. That’s exactly why consulting with IoT and security professionals like those at GDT is critically important now, and will become even more so over time. GDT’s Security practice is composed of talented, tenured security analysts and engineers who protect the networks of organizations of all sizes, and from a wide variety of industries, including service providers and government agencies. They can be reached at SOC@gdt.com. They’d love to hear from you.

The “App” is short for appliance, not application

By Richard Arneson

In 1992, several years prior to the Dot.com Bubble and when cell phones were the size, shape and weight of a canned ham, a company was born in Sunnyvale, California, located at the bottom tip of the San Francisco Bay. NetApp was the brainchild of three (3) individuals who had once worked for Auspex, a company against which they’d soon compete, and, just a decade later, help send into Chapter 11 and the OEM scrap heap.

The Evolutionary Disruptor

Self-described as “the data authority for hybrid cloud,” NetApp made news in 2017 with its entry into the highly competitive Hyperconvergence Integrated Systems (HCIS) Market. In fact, their entry prompted Gartner to name them as an Evolutionary Disruptor in its 2017 Hyperconverged Integrated Systems (HCIS) Competitive Landscape study.

NetApp originally determined that it couldn’t optimally deliver the true value of its proven SolidFire Element OS to VMs. Once it made that determination, it knew that entering the HCIS market was in its near future. That soul-searching helped it realize that, architecturally speaking, it made far more sense to package Element OS on bare-metal storage nodes so customers could take advantage of:

  • NetApp’s all-flash architecture,
  • Performance predictability through Quality of Service (QoS), and
  • Compression and Inline Deduplication across entire clusters.

Along with the many benefits that HCIS delivers―the ability to better address exact compute and storage needs, rapid scaling and more predictable storage through more efficient consolidation―NetApp’s solution utilizes VMware’s bare-metal hypervisor (ESXi) on compute nodes and, thanks to a simplified installation process, customers can get their HCIS system up and running fast.

NetApp’s management UI enables customers to leverage any management technologies they’re currently utilizing, including VMware’s vCenter and vRealize for orchestration.

NetApp’s HCIS options

The NetApp HCIS offering starts with a minimum configuration of two (2) chassis and four (4) storage nodes, after which additional nodes can be added independently. It comes in three (3) flavors:

  • Small compute (16 cores) with small storage (5.5TB capacity),
  • Medium compute (24 cores) with medium storage (11TB capacity), and
  • Large compute (36 cores) with large storage (22TB capacity).

Taking advantage of its large base of installed customers

ONTAP is NetApp’s proprietary data management platform for its storage arrays, such as FAS and AFF, and that, combined with its SolidFire Element storage OS, allowed it to tap into a large base of existing customers and provide an ideal launching pad for its HCIS solutions.

Need more info? Reach out to the experts…

GDT’s team of highly skilled and talented solutions architects and engineers have deployed hyperconverged solutions for customers of all sizes, and from a variety of industries. They’re experts at delivering HCIS solutions from many of GDT’s premier partners, including NetApp, of course, and helping customers enjoy the many benefits of hyperconvergence. They can be reached at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

Read more about hyperconvergence here:

The Hyper in Hyperconvergence

Composable and Hyperconvergence…what’s the difference?

Hypervisor shopping? Consider the following five (5) things before taking out your wallet

By Richard Arneson

Whether you’re looking to implement a virtualization strategy or are in the market to replace your current solution, you’ve got a decision to make―which hypervisor should I purchase? Remember, hypervisors are basically a platform for VMs: they abstract physical resources, such as memory and processors, from the host hardware, and each of those physical resources can be allocated to each of the virtual machines. For instance, a single server can be virtually turned into many, which allows multiple VMs to run off a single machine. (Click here for a refresher on the difference between hypervisors and containers.)
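As a rough mental model (not any vendor’s API), that abstraction can be pictured as carving one server’s fixed physical resources into per-VM allocations:

```python
class Host:
    """Toy model of a hypervisor carving one physical server into VMs."""
    def __init__(self, cpus, ram_gb):
        self.free_cpus, self.free_ram = cpus, ram_gb
        self.vms = {}

    def create_vm(self, name, cpus, ram_gb):
        # The hypervisor can only hand out resources the hardware actually has
        if cpus > self.free_cpus or ram_gb > self.free_ram:
            raise RuntimeError(f"insufficient resources for {name}")
        self.free_cpus -= cpus
        self.free_ram -= ram_gb
        self.vms[name] = {"cpus": cpus, "ram_gb": ram_gb}

host = Host(cpus=32, ram_gb=128)   # one physical server...
host.create_vm("web01", 8, 32)     # ...virtually turned into many
host.create_vm("db01", 16, 64)
print(len(host.vms), host.free_cpus, host.free_ram)  # 2 8 32
```

A real hypervisor does far more (scheduling, isolation, device emulation), but this is the core bookkeeping: multiple VMs drawing from one machine’s pool of hardware.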

First, determine which type of hypervisor you need

If you need to buy a bicycle for your upcoming 3-week mountain biking trip through the Sierra Nevada, you wouldn’t go shopping for a road bike with super narrow tires that can barely withstand the pounding of a pebble. You want a mountain bike that will fit the experience and help keep you upright while speeding down rocky, abandoned fire roads. You want the bike that will give you the best chance of success, enjoyment and safety. Hypervisors are no different.

While hypervisors perform an extraordinary service, there’s no doubt that naming the two (2) varieties was given little thought. Here they are―Type 1 and Type 2.

A Type 1 hypervisor is also referred to as bare metal, which simply means that it runs directly on the host hardware. Type 1 hypervisors are the faster of the two (2), requiring no OS acting as an intermediary, or middle layer, to slow them down.

A Type 2 hypervisor runs as a separate computer program on an OS, such as Windows or Linux. While Type 2 hypervisors are slower, they’re much easier to set up and great if a test environment needs to be quickly spun up.

Performance

If commodity isn’t the most over- and misused term in the IT industry, then it’s got to be a close second. There are some with the temerity to claim that hypervisors are a commodity, and that there’s little difference from one (1) to the next (it’s a pretty good bet that their sales numbers will somehow benefit from that uninformed characterization).

After determining the type of hypervisor you’ll need, it’s time to decide which is more important: high availability, or the flexibility to squeeze every ounce of performance from, as an example, CPU and RAM.

Hypervisors, unlike commodities, vary greatly from manufacturer to manufacturer. They’re complex, which is a given considering what they do, including, but not limited to, virtualizing all hardware resources, managing and creating VMs, handling all communications between VMs, and creating resource pools and allocating them to specific VMs. A commodity? Yeah, right.

Management Tools

If “hands-on” describes your VM management philosophy, then determining which hypervisor provides the best and/or most management tools should be a consideration. And those tools don’t just refer to ones of the out-of-the-box variety; understanding what’s available as add-ons from 3rd party developers should represent part of your purchase criteria.

Overall Environment

If you think you’ve found the mountain bike you’d like to buy, but its support, documentation, and ability to utilize 3rd party accessories are limited, you might want to reconsider. The same holds true for hypervisors. If a hypervisor’s support―including documentation, an active and easily accessible user community, and the ability to accommodate 3rd party developers―is limited, that should weigh into your decision. That’s not to say you’ve found a lemon and it should be stricken from the mix, but deficiencies in these areas could prove frustrating, even costly, down the road.

Oh, yeah, the cost…

Pull out your paper and pen for the Pros & Cons list. In short, you’re looking to strike the perfect balance between functionality and cost. Here’s where it gets tricky―the price range of hypervisors is wide, as in Pacific Ocean-wide. Some are not only priced to move, but are practically given away. Also, make certain you understand any associated licensing.

And, yes, you can utilize hypervisors from multiple vendors, but management tools, for instance, will vary from vendor to vendor, making management more complex. But if, for instance, certain workloads are less mission-critical than others, using different hypervisors might be the way to go.

…or you can turn to the hypervisor and hyperconvergence experts at GDT

The talented solutions architects and engineers at GDT have implemented a wide array of solutions for organizations of all sizes, including enterprises, service providers and government agencies. They are highly skilled at implementing solutions from GDT premier partners, including VMware for hypervisors, and hyperconverged solutions from HPE (SimpliVity), and Cisco (HyperFlex). You can reach them at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

Disaster Recovery (DR) in the Cloud

By Richard Arneson

When organizations first began to realize that they’d become reliant on their computer systems, a new service was invented, or, at least, was needed―Disaster Recovery. Prior to that, disaster recovery meant little more than making sure your insurance premiums were paid up. This new reliance on computers―primarily due to mainframes in the early 1970s―resulted in IT professionals asking themselves the same question: What happens to all of our vital information if [fill in the blank] happens? The first company to answer that question was SunGard, which provided customers exact, functioning duplicates, or “hot” sites, of their existing infrastructure. If the primary went down, the secondary was used. SunGard’s solution served its purpose, but was expensive and immediately doubled customers’ infrastructure costs. Soon scaled-down solutions were offered (“warm” and “cold” sites), which replicated only those portions of the infrastructure that were required to remain operational at all times. Still expensive.

Over the years, there has been a spate of DR solutions, from physical tape backups that need to be stored off-site, to redundant WAN circuits linked to replicated networks hosted at 3rd party data centers. Regardless of the plan or strategy used, there are several elements of DR that most in the industry have always agreed on―DR planning is time-consuming, tough to orchestrate, expensive to test, and definitely not the most glamorous or glorified of responsibilities in the IT industry. DR is a little like being the deep snapper in football. You never hear about the good snaps, only the ones that go over the punter’s head and out of the back of the end zone. Not much glory in that.

Here’s what utilizing the Cloud for DR provides…

Quicker Recovery Times

Backing up to the cloud enables customers to recover in a matter of minutes, as opposed to days―sometimes weeks―in the event a legacy DR plan is being utilized. Virtualization encapsulates entire servers, operating systems and applications into a virtual server that can be backed up or copied to an offsite data center. And that virtual server can be spun up on a virtual host in the event a disaster creates the necessity.
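The mechanics can be pictured with a toy snapshot-and-restore model (the server contents here are invented for illustration):

```python
import copy

# Toy model: the production VM's full state is replicated offsite on a
# schedule; after a "disaster," a replica is spun up from the snapshot
# in minutes rather than rebuilt from scratch
production = {"os": "linux", "app": "billing", "records": [1, 2, 3]}
offsite_snapshot = copy.deepcopy(production)   # scheduled replication

production.clear()                             # disaster strikes
recovered = copy.deepcopy(offsite_snapshot)    # spin up on a virtual host
print(recovered["records"])  # [1, 2, 3]
```

Cloud providers automate exactly this pattern at VM granularity, which is why recovery times shrink from days to minutes.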

Easy Scalability

As opposed to traditional DR solutions (tape backups, redundant data center), utilizing the Cloud for DR means enjoying the flexibility of easily scaling storage capacity up or down based on exact business needs at that time.

Enhanced Security

Arguably the most common myth about the Cloud has to do with security. Actually, that may be one of the best benefits of the Cloud, as things like patch handling and security monitoring are delivered by Cloud providers, such as Azure, AWS or Google Cloud.

Significant Savings

The Cloud’s pay-as-you-go model is incredibly appealing, especially considering the IT industry has been saddled for years with the guilt that comes from waste and inefficiency. Right-sizing any solution has always been the bane of IT professionals; the Cloud provides an answer to that.

Give the Cloud experts a call

If you have questions or concerns about creating and/or implementing a DR plan that will entirely, or partially, incorporate the Cloud, contact GDT’s Cloud practice professionals at CloudTeam@gdt.com. They’re composed of talented Cloud architects and engineers who have successfully deployed Cloud solutions from GDT premier partners AWS, Azure and Google Cloud. They’d love to hear from you.

FlashBlade™ ― an AI answer from a VIP provider

By Richard Arneson

If you are in any way connected to the IT industry, you can’t, and haven’t been able to for years, take a breath without stumbling across the word Flash. With apologies to the superhero created prior to World War II, flash was, as early as twenty (20) years ago, associated with Adobe Flash, the ubiquitous plug-in originally created by Macromedia that allows animations and interactive content to be incorporated into web browsers. Flash forward a few years and now that word is all about memory and storage. While flash storage was initially manufactured in 1992 by SanDisk, the technology didn’t truly sink its teeth into consumers until USB flash drives were introduced to the marketplace at the turn of the century (this century). Since those thumb drives were introduced, however, the word flash and how it’s referenced has come a long, long way.

(If you need a refresher on the relationship between flash memory and flash storage, check this out―Flash, yes, but is it storage or memory?)

Pure Storage―take a guess what they’re experts at?

Pure Storage, as its name implies, focuses on, and specializes in, one (1) hugely important segment of the industry―storage. Started just nine (9) years ago, Pure Storage is time and again voted a leader in its field. If you’re familiar with the Gartner Magic Quadrants, their analysis of solid-state arrays has listed Pure Storage within its coveted upper-righthand “Leader” quadrant in each of the last five (5) years. And if that’s not enough, they’re listed as the most northeastern company in the Leader quadrant. In other words, their “Ability to Execute” and “Completeness of Vision” places them firmly ahead of the other eleven (11) companies researched.

In the IT industry, being a jack of all trades and master of none―whether you’re an engineer, consultant, equipment manufacturer, et al.―can be a risky proposition. It’s possible (see Cisco, Dell EMC and HPE), but it’s far easier to take this approach if you’re in, well, another industry. Let’s face it; the IT industry is a far different animal. It encompasses so much information, thoughts, theories, research and technologies that attempting to master it all is like trying to sop up the Atlantic Ocean with a beach towel.

FlashBlade―another “flash” term you should learn

To dovetail with yesterday’s blog (Artificial Intelligence, Machine Learning and Deep Learning), FlashBlade is Pure Storage’s answer to the growing need for AI (Artificial Intelligence) and that technology’s ability to transform data into intelligence.

Earlier this year, Pure Storage joined forces with NVIDIA, the 20-year-old PC gaming company, to create what they’re calling AIRI, which stands for AI-Ready Infrastructure. Gaming aside, NVIDIA created the GPU (graphics processing unit), which has exponentially more processors per chip than a CPU. GPUs are optimized specifically for data computations, and their processors are much smaller than a CPU’s, which means more of them can be jammed onto a single chip. And because AI, Machine Learning and Deep Learning must perform computations on huge amounts of data, GPUs can perform up to ten (10) times better than their CPU counterparts.

The Pure Storage and NVIDIA AIRI is specifically built for deep learning environments and delivers a fully integrated platform that provides an out-of-the-box, scaled-out AI solution. The rack-scale architecture allows customers to add additional blades based on their specific AI needs, and to do so without any data migration or downtime.

Ultimately, AIRI was created to help customers more easily dip their toes into the AI waters with a low-latency, high-bandwidth, out-of-the-box solution, all in a compact, 4U form factor.

An even simpler solution…

The tenured, talented engineers and solutions architects at GDT are experienced at delivering advanced, cutting-edge solutions for enterprises, service providers and government agencies of all sizes. If you have questions about GDT premier partner Pure Storage and what their products and solutions can provide to your organization, contact them at Engineering@gdt.com or at SolutionsArchitects@gdt.com. They’d love to hear from you.

A-M-D-I-L-L: Unscrambled, these letters represent some of the hottest topics in the IT Industry  

By Richard Arneson

His name might not carry the same weight as Abner Doubleday’s, who is credited with inventing baseball in the early- to mid-1800’s, but Walter Camp is the person widely regarded as the creator of America’s most popular current sport―football. It’s impossible to know exactly what Camp envisioned for football, his amalgamation of soccer and rugby that he invented roughly fifty (50) years after Doubleday’s, but this much is certain―he never imagined it would be used as an analogy to describe Artificial Intelligence (AI), Machine Learning (ML) and Deep Learning (DL).

Artificial Intelligence―the game

Given that it came before ML, which came before DL, AI, like football, has no predecessor. To borrow from mathematics, AI is the superset of subsets ML and DL. And like Camp’s invention, pinpointing the creation date of AI is next to impossible. While the use of the name Artificial Intelligence is widely attributed to John McCarthy, who used it during a Dartmouth academic conference in 1956, its actual invention is up for debate.

However, here’s what is widely agreed upon―AI sets out to utilize a machine to mimic human thinking. For years―decades, in fact―the public’s understanding of AI was largely the result of science fiction writers, who penned, among countless other sci-fi works, 2001: A Space Odyssey, Westworld and Blade Runner. Presently, AI is taking it on the chin because it’s feared that it will take jobs from people and that smart devices are covertly gathering way, way too much information about their users.

Today AI is utilized by too many applications and appliances to name, but the most common are Netflix, Amazon’s Alexa, Apple’s Siri and Nest, the learning thermostat that Google purchased four (4) years ago. While some might argue that those hardly represent the benefits of AI, there are certainly examples of how it can deliver to humans a better quality of life. For instance, there are new AI platforms capable of providing health advice, including specific diagnoses, to people who can’t afford medical care or access medical facilities.

Machine Learning―the players and the plays they run

Machine Learning (ML) takes AI to the next level. It’s not uncommon to hear ML and AI used interchangeably―they shouldn’t be; they’re different. The players aren’t football; football is the game they play. While AI addresses “If A happens, then B needs to happen,” ML instead determines, “If A happens, then I’ll learn what should happen next.” Yes, the machine, as the name suggests, thinks. In the case of Amazon.com, ML algorithms gather the type of movie, book or song that you enjoy, then look to see what others who share your interests are into.
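A toy version of that “people like you” logic, using made-up viewing histories and a simple Jaccard overlap score:

```python
def jaccard(a, b):
    """Overlap between two users' tastes: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b)

# Hypothetical viewing histories
you = {"The Matrix", "Blade Runner", "Alien"}
others = {
    "user1": {"The Matrix", "Alien", "Dune"},
    "user2": {"Titanic", "The Notebook"},
}

# Find the user most similar to you, then recommend whatever they
# enjoyed that you haven't seen yet
best = max(others, key=lambda u: jaccard(you, others[u]))
print(best, others[best] - you)  # user1 {'Dune'}
```

Production recommendation engines are vastly more sophisticated, but the kernel is the same: measure similarity between users, then borrow from the closest match.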

Deep Learning―the dekes, fakes and cuts

To remain with the football analogy, if AI is the game and ML represents the thinking players utilize to carry out the plays, DL is what allows a player to improvise in the event a defender stands between them and the goal line. DL attempts to enable machines to draw conclusions. Deep Learning is a type of Machine Learning, just its next evolution.

In the event this comes up in Trivial Pursuit, the Deep in DL is borrowed from deep artificial neural networks, which is another way of referencing DL―if you ever hear that term, you’ve just heard a synonym for DL. And in case you’re wondering, neural refers to the interactions and interconnections that exist between the neurons in the human brain. Yes, the thinking human brain.

The best part about AI, ML and DL

Whether or not you realize it, you’re only a few clicks away from learning more about AI, ML and DL by accessing some of the most talented and experienced solutions architects and engineers in the industry. GDT’s engineering and technical expertise has delivered solutions to companies of all sizes and from a wide variety of industries. In addition to enterprises, GDT lists as clients some of the most notable service providers and government agencies in the world. You can reach them at Engineering@gdt.com or at SolutionsArchitects@gdt.com. They’d love to hear from you.

What is FedRAMP, and why is it mandatory for federal agencies?

By Richard Arneson

Politically speaking, people want the government to intervene either more or less, but there’s something we can all agree on—FedRAMP is a good thing. FedRAMP is short for Federal Risk and Authorization Management Program, which is another way of saying Keeping federal agencies’ data safe when using cloud services. Now, instead of agencies deploying cloud applications and services willy-nilly (see unsecured), they can safely turn to a cloud services provider (CSP) that has earned FedRAMP accreditation. In addition to ensuring that agencies receive the highest levels of cloud security, it also enables them to save considerable time and money that they’d otherwise spend assessing providers. Here’s another thing we can all agree on―government waste is a bad thing. FedRAMP addresses that.

The FedRAMP certification process

Becoming FedRAMP-certified is not like getting a driver’s license, where a few classes are taken, a simple exam is passed, and a seal of approval is stamped before the certification is issued. Getting FedRAMP-certified is an extensive process, and it should be. Not to downplay the importance of enterprises’ mission-critical information, but when it comes to government data, the safety of about 330,000,000 U.S. citizens is at stake.

Even though FedRAMP was introduced over seven (7) years ago by the U.S. Office of Management and Budget, there are currently only about one hundred (100) providers that are FedRAMP-certified. Each is broken out into one (1) of three (3) service models: IaaS, PaaS and SaaS. A handful are certified in more than one (1) service model, and that list is primarily composed of a few companies with which we’re pretty familiar―Google, Microsoft, AWS (Amazon Web Services) and Salesforce.

Providers can get FedRAMP certified in one (1) of two (2) ways: either through a Joint Authorization Board provisional authorization (JAB P-ATO) or through a sponsoring agency, known as an Agency Authority to Operate (ATO).

Joint Authorization Board provisional authorization (JAB P-ATO)

The JAB includes representatives from the Department of Defense (DoD), the Department of Homeland Security (DHS) and the General Services Administration (GSA). Their vetting process is so extensive that they authorize only three (3) CSPs per quarter. First, however, the provider must prove that there has been a demonstrated demand for their service by a wide array of agencies. That initial hurdle knocks a huge percentage of applicants out of the running.

Extensive security assessments are conducted by the JAB, after which they conduct, with the applicant, a collaborative deep-dive into their cloud offerings, architecture, and capabilities (especially as it relates to security). A thorough Q&A session caps off the application process, after which the JAB makes their decision to grant, or not grant, FedRAMP authorization.

Agency Authority to Operate (ATO)

The FedRAMP authorization process has taken into consideration providers that have only a few agencies interested in their services, or that have designed a cloud for a particular agency. In this case―and because it’s required that agencies only utilize FedRAMP-authorized providers―the provider would apply for certification through the ATO process. Basically, it allows providers to gain certification on an as-needed basis.

The ATO process requires that the CSP formalize their partnership with a particular government agency. First, however, their service must be fully built and functional. It’s up to the agency to analyze and approve the applicant’s SSP (System Security Plan), after which a Security Assessment Plan (SAP) needs to be developed with a 3PAO (3rd party assessment organization). 3PAOs are organizations accredited by the U.S. government to assess cloud service providers and test their SAPs to ensure FedRAMP compliance.

Which certification process to choose?

JAB is good for providers offering services that can be utilized by multiple agencies; ATO is best for those providers that have developed what can best be described as a niche offering. FedRAMP doesn’t want to exclude agencies from being able to access a particular service if it perfectly meets their needs. Hence, the ATO process. But regardless of which authorization process providers choose (and it is up to them), the goals are the same―secure and diverse cloud services options for federal agencies.

Even if you’re not a government agency…

Utilizing a cloud service provider that is FedRAMP-certified provides organizations peace of mind, whether they are a federal agency or not, in knowing that they’ve selected a company that has been carefully, and laboriously, vetted by the U.S. government. And that perfectly describes GDT, which has been FedRAMP-certified for years and secures the government cloud for agencies of all sizes. In addition, they provide cloud services for enterprises and service providers of all sizes, and from a variety of industries. You can contact GDT’s talented solutions architects and engineers at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

Understanding the Attack Surface

By Richard Arneson

Leave it to Hollywood to allow the smallest attack surface in history to be breached. In the first Star Wars movie, the Death Star, which appeared to be only slightly smaller than Earth, had a tiny aperture that, if penetrated, would magically destroy the entire, menacing orb. Naturally, it was hit―it’s Hollywood. Unfortunately, the attack surface of organizations, at least in terms of networking, is quite a bit larger, probably far more so than you’d think.

The Attack Surface

Attack Surface refers to the collective sum of all points of entry or interaction that are vulnerable to malware, worms, Trojans, hackers, you name it. Attack Surfaces encompass three (3) areas of vulnerability: the network, the applications that traverse it, and people, or employees, who happen to pose the greatest security threat to organizations.

Network

The bad guys are looking for networks with multiple interfaces; the more the better. Take tunnels, for instance, which are constructed between communication points through data encapsulation―they can pose a huge threat to network security. For data transmission, Point-to-Point Protocol (PPP) and VPNs encapsulate non-routable data inside routable data. When data arrives at its intended destination, the outer packet is stripped off, which allows the inner data to enter the private network. Here’s one of the issues: it’s difficult to know exactly what has been encapsulated, which can inadvertently provide a protective shield for hackers. Talk to the folks at Home Depot or Target; they’ll tell you about VPN-related security vulnerabilities.
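The encapsulation idea can be sketched in a few lines. This is purely illustrative (the two-byte outer header and length field are invented placeholders, not real PPP or VPN framing):

```python
import struct

def encapsulate(outer_header: bytes, inner: bytes) -> bytes:
    """Wrap a non-routable inner payload behind a routable outer header."""
    # A length field tells the receiver where the inner payload ends.
    return outer_header + struct.pack("!H", len(inner)) + inner

def decapsulate(packet: bytes, header_len: int) -> bytes:
    """Strip the outer header so the inner data can enter the private network."""
    (length,) = struct.unpack_from("!H", packet, header_len)
    return packet[header_len + 2:header_len + 2 + length]

inner = b"private-network payload"
packet = encapsulate(b"\x45\x00", inner)  # hypothetical 2-byte outer header
assert decapsulate(packet, 2) == inner
```

Notice that decapsulate never inspects the inner bytes―which is exactly why it’s hard to know what a tunnel is actually carrying.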

Any outward-facing, open ports (which means they’re open to receiving packets) can add to a network’s Attack Surface by revealing information about a particular system, even the network’s architecture. Open ports sound negligent, even irresponsible, but they’re necessary in certain situations. For instance, think back to when you set up your personal e-mail account and entered incoming and outgoing port numbers. Those are open ports, but not adding, or opening, them means you can’t send or receive your emails. Yes, open ports are often needed, but can open the door to unseemly intentions.
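Checking whether a TCP port is open (i.e., accepting connections) takes only Python’s standard library; the host and port you probe are up to you:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

An attacker mapping your Attack Surface is doing essentially this, thousands of times over, across every address you expose.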

Software

Thanks to the rapid evolution of Cloud services, new applications to access them are being developed by the minute. Hackers, as well, are creating ways to access and exploit them…by the minute. The more code that is deployed and executed, the more of it is exposed to users, including those of the unauthorized variety.

No question, cloud computing has greatly added to the complexity of securing vital data. The proliferation of applications requires commensurate security measures.

The Human Factor

As previously mentioned, employees, or authorized users, far and away produce the greatest security threats to organizations; they significantly expand the Attack Surface. Unauthorized applications are downloaded, emails from unknown senders are opened, and authorizations aren’t turned off after an employee leaves the company. And if they’re disgruntled ex-employees, the Attack Surface just got bigger. Even Instant Messaging programs can crack open a security door once believed to be closed.

Attack Surface Questions? Turn to the Security Experts

Security breaches through Attack Surfaces, whether minimal or broad in scope, cost organizations worldwide over $2 trillion. Talking to the security experts at GDT should be your first order of business. Believing a security breach won’t happen to your company is setting you up for grave, and expensive, consequences in the future. From its state-of-the-art, 24x7x365 Security Operations Center (SOC), GDT’s security analysts and engineers manage and monitor network security for some of the most noted enterprises, service providers and government entities in the world. Contact them today at SOC@gdt.com. They’d love to hear from you.

HPE’s recent acquisition of Plexxi gives it a leg up on its composable competitors

By Richard Arneson

In May of this year, HPE announced its purchase of Plexxi, an eight-year-old, Boston-based company that set the IT world on fire based on this one (1) idea: data center networking needed to be less complicated, yet more powerful. They combined software-defined networking with intent-based automation that addressed workload and infrastructure awareness to revolutionize the way networks are managed. The result? Simplified tasks, increased efficiency, and reductions in complexity and costs.

With its purchase of Plexxi, HPE greatly enhanced its software-defined portfolio by combining Plexxi’s Next-Gen data center fabric with its existing software-defined infrastructure. HPE customers will be better equipped to enjoy a true cloud-like experience in their data center. Automatic creation and re-balancing of bandwidth will be able to perfectly address the needs of specific workloads, and applications can be deployed faster. Customers will be able to better, and faster, harness the true value of their data.

The two (2) Clear Values HPE is receiving as a result of its Plexxi acquisition

HPE is integrating Plexxi’s technology into its already robust hyperconverged solutions, which are the result, in part, of their 2017 purchase of SimpliVity. According to HPE, “The purchase of Plexxi will enable us to deliver the industry’s only hyperconverged offering that incorporates compute, storage and data fabric networking into a single solution, with a single management interface and support.”

HPE anticipates two (2) key, clear opportunities from its Plexxi purchase:

  1. The combination of the Plexxi and SimpliVity solutions and technologies will deliver to customers a dynamic, workload-based model that will much better align IT with their business goals. Prior to the Plexxi acquisition, Gartner’s Magic Quadrant for hyperconvergence already listed HPE as one of the industry’s leaders. With Plexxi, their lead just got longer.
  2. Secondly, Plexxi’s technology will enhance HPE Synergy, its existing composable infrastructure portfolio that offers pools of resources for storage and compute. HPE Synergy is built on HPE OneView, which enables users, from a single interface, to accelerate applications and service delivery, and allows logical infrastructures to be composed or recomposed at (near) instant speeds.

HPE, at last count, has almost 1,500 composable infrastructure customers. Now throw Plexxi into the mix, and that number will get bigger, in a hurry.

First, turn to the HPE and composable infrastructure experts at GDT

HPE is one of GDT’s premier partners, and their solutions and products have been architected, engineered, deployed and monitored by GDT for enterprises, government entities and some of the largest service providers in the world. GDT’s talented solutions architects and engineers are experts in delivering composable infrastructure solutions—including, of course, HPE Synergy—and helping organizations of all sizes enjoy its many benefits. You can contact them at SolutionsArchitects@gdt.com or Engineering@gdt.com. They’d love to hear from you.

When Containers need a conductor―that’s Orchestration    

By Richard Arneson

Containers, if you recall from last week’s blogs, pull from the application layer and package code and related application dependencies into one (1) neat, tidy package. Remember, this provides a step up from hypervisors, which require each VM to run its own OS, making them less efficient, especially when heavy scaling is required. There are other benefits of containers, of course, and you can refresh your memory here – VM, Hypervisor or Container?.

But the greatness of containerization―a fast, easy way to test and implement apps, address ever-fluctuating demands of users, quickly move apps between servers, et al.― can lead to management issues. The more containers that are created, the more inventory is created to maintain and manage. ZZ Top (3 members) doesn’t need a conductor, but when the New York Philharmonic (over a hundred) plays Beethoven’s 9th, it’s a must. And in the case of containerization, the conductor is called, appropriately, Orchestration.

Orchestration―making beautiful Container music

Orchestration software delivers a management platform for containers and helps define any relationships that exist between them. It can address containers’ need to scale, including how they talk to the world around them.

In short, Orchestration manages the creation, upgrading and availability of multiple containers, and controls connectivity between them. Entire container clusters can be treated as single deployments.

In addition, Orchestration provides:

  • A single, virtual host that can cluster multiple hosts together, all accessible through a single API.
  • Ease of host provisioning, and invalid nodes can be detected and automatically re-scheduled.
  • Linking of containers, including clusters maintained within containers.
  • The ability to control exactly when containers start and stop, and to group them into clusters, which can be formed for multiple containers that have common requirements. Clusters = easier management and monitoring.
  • The ability to easily handle processes related to an application, and included toolsets enable users to better steer deployments.
  • Automated updates, including the “health” of containers, and the ability to implement failover procedures.
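At its core, the health-monitoring and failover behavior in that list is a reconcile loop: compare the actual state of the cluster to its desired state and fix the difference. Here’s a minimal sketch, using a hypothetical Container stand-in rather than a real runtime API:

```python
class Container:
    """Hypothetical stand-in for an object a real container runtime would return."""
    def __init__(self, name: str):
        self.name = name
        self.healthy = True

    def restart(self) -> None:
        self.healthy = True  # a real restart would relaunch the process

def reconcile(cluster):
    """Restart every unhealthy container; return the names that were restarted."""
    restarted = []
    for c in cluster:
        if not c.healthy:
            c.restart()
            restarted.append(c.name)
    return restarted

cluster = [Container("web"), Container("db")]
cluster[1].healthy = False  # simulate a failed health check
print(reconcile(cluster))  # → ['db']
```

Real orchestrators run loops of exactly this shape continuously, against entire clusters, which is what lets them treat many containers as a single deployment.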

We’re living in an Application-Centric world

Applications get larger and more complex with each passing day, and without containerization (and Orchestration), getting them to work harmoniously is unwieldy, time-consuming and expensive, and takes personnel off the key projects and initiatives that will keep their organization competitive in the marketplace. If there’s a need to develop, test and deploy sophisticated applications, Containers and Orchestration can help you play the right tune.

Turn to the engineers and solutions architects at GDT for more information about Containers and Orchestration

The talented technical professionals at GDT are experienced at helping customers enjoy the many benefits that Containers and Orchestration can deliver. They work with organizations of all sizes, and from a wide variety of industries, including government and service providers. They can be reached at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

Virtual Machine or Container…or Hypervisor? Read this, and you can make the call

By Richard Arneson

Containers have been around for years, but we’ll leave their history for another blog. Hypervisors, if you recall, are software that manage virtual machines (VMs), each of which can run its own programs and appears to have its own memory, processor and resources, though it’s actually sharing the host hardware’s. Hypervisors are, basically, a platform for VMs. But don’t be surprised to hear hypervisor and VM used interchangeably; they shouldn’t be, but it’s not uncommon. Just remember―hypervisors are the software that run VMs.

They’re both Abstractions, but at different layers

Hypervisors (VMs)―physical layer

Abstractions relate to something that’s pulled, or extracted, from something else. Hypervisors abstract physical resources, such as those listed above (memory, processor, and other resources), from the host hardware. And those physical resources can be abstracted for each of the virtual machines. The hypervisor abstracts the resources at a physical level, capable of, as an example, turning a single server into many, thus allowing for multiple VMs to run off a single machine. VMs run their own OS and applications, which can take up loads of resources, even boot up slowly.

Containers―application layer

Containers are, again, an abstraction, but pull from the application layer, packaging code and related dependencies into one (1) happy family. What’s another word for this packaging? Yep, containerization.

What are the benefits of containers over VMs?

Application Development

There are several benefits related to containers, but we’ll start with the differentiator that provides the biggest bang for the buck. Prior to containers, software couldn’t be counted on to reliably run when moved to different computing environments. Let’s say DevOps wants to move an application to a test environment. It might work fine, but it’s not uncommon for it to work―here’s a technical term―squirrelly. Maybe tests are conducted on Red Hat and production will be on, say, Debian. Or both locations have different versions of Python. Yep, squirrelly results.

In short, containers make it far easier for software developers by enabling them to know their creations will run, regardless of where they’ve been deployed.

Efficiency

Containers take up far less space than VMs, which, again, run their own OS. In addition, containers can handle more applications and require fewer VMs. Make no mistake, VMs are great, but when heavy scaling is required, you may find yourself dedicating resources that are, basically, managing a spate of operating systems.

And consider moving workloads between vendors with VMs. It’s not as simple as dragging an application from one OS to the other. A vSphere-based VM can’t have associated workloads moved to, say, Hyper-V.

Microservices

Microservices, which can run in containers, break down applications into smaller, bite-sized chunks. It allows different teams to easily work independently on different parts or aspects of an application. The result? Faster software development.

No, containers don’t mark the end of VMs and Hypervisors

In fact, containers and VMs don’t need to be mutually exclusive. VMs and containers can co-exist beautifully. As an example, a particular application may need to talk to a database on a VM. Containers can easily accommodate this particular scenario.

Sure, containers are efficient, self-contained systems that allow applications to run, regardless of where they’ve been deployed. But containers might not be the best option for all situations. And without expertise within IT departments to understand this difference, it will probably leave you wondering which―VMs or containers―will be the most beneficial to your organization. And, again, it might not be an either/or situation. For instance, as containers share one OS, it could, if you don’t have security expertise, leave you more open to security breaches than if utilizing VMs. Your best bet? Talk to experts like those at GDT.

Please, use your resources

You won’t find better networking resources than GDT’s talented solutions architects and engineers. They hold the highest technical certifications in the industry and have designed and implemented complex networking solutions for some of the largest enterprises and service providers in the world. They can be reached at SolutionsArchitects@gdt.com or at Engineering@gdt.com. They’d love to hear from you.

Shadow IT―you might be a participant and don’t even know it

By Richard Arneson

Everybody loves the cloud, and why wouldn’t they? The amount of innovation and productivity it has brought to businesses worldwide has been staggering. Where Salesforce once appeared to stand alone as the only cloud-based software service, it’s been joined over the past few years by thousands of applications that were once individually loaded on PCs (Office 365, the Adobe Creative Suite and WordPress come to mind). But with the good comes the bad―more accurately, the concerns―and, in the case of The Cloud, you can list issues related to security, governance and compliance as those that counterbalance the positive side of the Cloud ledger.

Shadow IT

Not to paint everybody with the same, broad brush stroke, but the preponderance of workers either have participated in Shadow IT, or continue to do so (it’s primarily the latter). Shadow IT refers to information technology that operates and is managed without the knowledge of the IT department―doesn’t sound very safe and secure, does it? Have you ever downloaded software that helps accomplish a task or goal without the knowledge of IT? Probably, right? That’s Shadow IT. But that’s not to say Shadow IT participants are operating with devious intentions; they do it for a variety of reasons, such as a need for expediency, or perhaps because corporate red tape, including prerequisites, precludes going through official channels. Participants’ goals―efficiency, productivity―may be noble and spot-on, but their actions can create a host of security headaches and issues at some point in the future. And there’s a very good chance it will. It’s estimated that within one (1) year, data breaches worldwide will cost organizations a collective $2.1 trillion. Oh, and the United States has the highest cost per breach ($7.9 million) in the world. Shadow IT helps buoy those numbers. Thinking a security issue only happens to the other guy is living in a fool’s paradise.

Cloud Access Security Brokers (CASB)

Sending out policies and conducting training for employees regarding computer and network use is great, and strongly encouraged, but counting on everybody adhering to these mandates is unreasonable and impractical, especially if your company has tens of thousands of workers scattered throughout the world.

To address the issue of Shadow IT, the industry has developed Cloud Access Security Brokers (no, they’re not people, but software), the name given by Gartner five (5) years ago that describes cloud security solutions centered around four (4) pillars: visibility, compliance, data security and threat protection. CASB is software planted between a company’s IT infrastructure and the cloud, and is now offered by several vendors, including Cisco―its CASB solution is called CloudLock (you can read about it here – Cisco CloudLock).

CASB utilizes an organization’s security policies to secure the flow of data to and from its IT infrastructure and the cloud. It encrypts data, protects it from malware attacks, and helps defend against the scourge that is Shadow IT.

For more information…

With the help of its state-of-the-art Security Operations Center (SOC), GDT’s team of security professionals and analysts have been securing the networks of some of the most noteworthy enterprises and service providers in the world. They’re highly experienced at implementing, managing and monitoring Cisco security solutions. You can reach them at SOC@gdt.com. They’d love to hear from you.

What exactly is a Network Appliance?

By Richard Arneson

We work in an industry rife with nomenclature issues. For instance, Hybrid IT is often used interchangeably with Hybrid Cloud―it shouldn’t be; they’re different. The two were even referred to as such in an “also known as” manner within a beautiful, 4-color brochure produced by one of the leading equipment vendors in the IT industry. I’ve seen hyperconverged substituted for converged, SAN confused with NAS, SDN and SD-WAN listed as equivalents. The list is seemingly endless.

The good news? Getting the answer is pretty easy, and only a few clicks away. Yes, Google is, for most, the answer to getting correct answers. Ask it a question, then read through the spate of corresponding articles from reputable sources, and you can generally deduce the right answer. When ninety-eight (98) answers say it’s A, and one (1) claims it’s B―it’s probably A.

When does “it” become an Appliance?

Sitting in a non-company presentation recently, I heard the word appliance used several times, and, even though I’ve been in the IT and telecommunications industry for years, I realized I didn’t technically know what appliance meant, or how it differed from other networking equipment. I turned to the person seated at my left and asked, “What’s the difference between an appliance and a piece of networking equipment, be it a router, server, etc.?” The answer he provided offered little help. As an attempt to hide my dissatisfaction, I quietly whispered the same question to an engineer on my right. His answer could be only slightly construed as similar to the first response―slightly. In fact, the only true commonality between the answers came in the form of two (2) words―single function. Clear as Mississippi mud pie, right? During a break, I asked the question of several in attendance, and got answers that ran a mile wide and an inch deep, but provided, essentially, little information, possibly less than before.

I turned to Google, of course. But I discovered something I didn’t believe was possible―there was literally no definition or information I could find that even attempted to distinguish what, exactly, makes for a network appliance. According to “my history” in Google Chrome, I typed in over thirty (30) variations of the same question. Nothing. Frustrating. But I had something better than Google.

It works with governmental elections

GDT has over two-hundred (200) solutions architects and engineers, all talented and tenured, who have earned, collectively, well over one thousand (1,000) of the industry’s highest certifications. Why not poll some of the industry’s best and brightest with the question, “What differentiates an ‘appliance’ from other networking equipment?”

They weren’t allowed to reply “TO ALL,” so that others’ answers wouldn’t influence theirs. Also, they couldn’t Google the question, or any derivative thereof, which, based on my experience, wouldn’t have helped anyway.

Drum roll, please

Responses came pouring in, even though it was after 5 PM on a Friday afternoon. So in lieu of posting well over one hundred (100) responses, I decided to craft, based on those responses (one was even a haiku), a definition of a network appliance related to how it’s differentiated from a non-appliance. Here goes…

A network appliance is different than a non-appliance because it comes pre-configured and is built with a specific purpose in mind.

And because I’m a fan of analogies, here’s one I received:

“You can make toast in the oven, but you’ve got a toaster, a device that is specifically made for making toast. Because it’s designed for a narrow problem set, the toaster is smaller than the oven, more energy efficient, easier to operate, and cheaper. An appliance is something that is able to be better than a general-purpose tool because it does less.”

And for you Haiku fans:

“It is a server

Or a virtual machine

That runs services”

There it is―a definition, an analogy, even a Haiku. Now don’t get me started on the word device.

Turn, like I did, to the experts

GDT’s team of solutions architects and engineers maintain the highest certification levels in the industry. They’ve crafted, installed and currently manage the networks and security needs of some of the largest enterprises and service providers in the world. They can be reached at SolutionsArchitects@gdt.com or at Engineering@gdt.com. Great folks; they’d love to hear from you.

Riding the Hyperconvergence Rails

By Richard Arneson

If your organization isn’t on, or planning to get on, the road to hyperconvergence (HCI), you may soon be left waving at your competitors as the HCI train flies by. A recent industry study found that approximately 25% of companies currently use hyperconvergence, and another 23% plan on moving to it by the end of this year. And those percentages are considerably higher in certain sectors, such as healthcare and government. In addition to the many benefits HCI delivers—software-defined storage (SDS), an easier way to launch new cloud services, modernization of application development and deployment, and far more flexibility for data centers and infrastructures—it is currently providing customers, according to the study, an average of 25% in OPEX savings. It might be time to step up to the ticket window.

All Aboard!

If you haven’t heard about Dell EMC’s VxRail appliances, it’s time you do―they’ve been around for about two (2) years now. In that first year alone, they sold in excess of 8,000 nodes to well over 1,000 customers. And in May of this year, they announced a significant upgrade to their HCI portfolio with the launch of more robust VxRail appliances, including significant upgrades to VxRack, its Software-Defined Data Center (SDDC) system. VxRail was closely developed with VMware, of which Dell EMC owns eighty percent (80%).

The VxRail Portfolio of Appliances

All VxRail appliances listed below offer easy configuration flexibility, including future-proof capacity and performance with NVMe cache drives, 25GbE connectivity, and NVIDIA P40 GPUs (graphics processing units). They’re all built on Dell EMC’s latest PowerEdge servers, which are powered by Intel Xeon Scalable processors, and are available in all-flash or hybrid configurations.

G Series―the G in G-Series stands for general, as in general purpose appliance. It can handle up to four (4) nodes in a 2U chassis.

E Series―whether deployed in the data center or at the edge (hence the letter E), the E Series’ sleek, low-profile design fits into a 1U chassis.

V Series―the V stands for video; it is VDI-optimized graphics ready and can support up to three (3) graphics accelerators to support high-end 2D or 3D visualization. The V Series appliance provides one (1) node in its 2U profile.

P Series―P for performance. Each P Series appliance is optimized for the heaviest of workloads (think databases). Its 2U profile offers one (1) node per chassis.

S Series―Storage is the operative word here, and the S Series appliance is perfect for storage dense applications, such as Microsoft Exchange or Sharepoint. And if big data and analytics are on your radar screen, the S Series appliance is the right one for you. Like the P and V Series appliances, the S Series provides one (1) node in its 2U profile.

And to help you determine which VxRail appliance is right for your organization, Dell EMC offers a nifty, simple-to-use VxRail Right Sizer Tool.

Perfect for VMware Customers

VMware customers are already familiar with the vCenter Server, which provides a centralized management platform to manage VMware environments. All VxRail appliances can be managed through it, so there’s no need to learn a new management system.

Questions about Hyperconvergence or VxRail?

For more information about hyperconvergence, including what Dell EMC’s VxRail appliances can provide for your organization, contact GDT’s solutions architects and engineers at SolutionsArchitects@gdt.com. They hold the highest technical certification levels in the industry, and have designed and implemented hyperconverged solutions, including ones utilizing GDT partner Dell EMC’s products and services, for some of the largest enterprises and service providers in the world. They’d love to hear from you.

When good fiber goes bad

By Richard Arneson

Fiber optics brings to mind a number of things, all of them great: speed, reliability, high bandwidth, long-distance transmission, immunity to electromagnetic interference (EMI), and strength and durability. Fiber optic cable is made of fine glass, which might not sound durable, but flip the words fiber and glass and you’ve got a different story.

Fiberglass, as the name not so subtly suggests, is made up of glass fibers―at least partially. It achieves its incredible strength once the glass is combined with plastic. Originally used as insulation, the fiberglass train gained considerable steam in the 1970s after asbestos, which had been widely used for insulation for over fifty (50) years, was found to cause cancer. But that’s enough about insulation.

How Fiber goes bad

As is often the case with good things, fiber optics doesn’t last forever. Or, it should be said, it doesn’t perform ideally forever. Several issues can prevent it from delivering its intended goals.

Attenuation

Data transmission over fiber optics involves shooting light between input and output locations; when the light intensity degrades, or loses power, along the way, it’s known as attenuation. High attenuation is bad; low is good. Attenuation is measured in decibels (dB), and there’s a mathematical equation that calculates its degree. This sum of all losses can be caused by degradation in the fiber itself, poor splice points, or any point or junction where the fiber is connected.
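That “sum of all losses” idea can be sketched as a simple link loss budget. The sketch below uses the standard decibel formula―10·log10(Pin/Pout)―and illustrative loss values (0.35 dB/km fiber, 0.1 dB per splice, 0.5 dB per connector); real figures come from your cable and connector datasheets.

```python
import math

def attenuation_db(power_in_mw: float, power_out_mw: float) -> float:
    """Attenuation in decibels: 10 * log10(Pin / Pout)."""
    return 10 * math.log10(power_in_mw / power_out_mw)

def link_loss_budget(km: float, db_per_km: float,
                     splices: int, splice_loss_db: float,
                     connectors: int, connector_loss_db: float) -> float:
    """Total loss is the sum of fiber, splice and connector losses."""
    return (km * db_per_km
            + splices * splice_loss_db
            + connectors * connector_loss_db)

# 10 mW launched, 1 mW received: a 10 dB loss
print(round(attenuation_db(10, 1), 1))  # 10.0

# 40 km of fiber at 0.35 dB/km, four splices, two connectors
print(round(link_loss_budget(40, 0.35, 4, 0.1, 2, 0.5), 2))  # 15.4
```

If that 15.4 dB total exceeds what the receiving electronics can tolerate, the link needs amplification, better splices, or shorter spans.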

Dispersion

When you shine a flashlight, the beam of light disperses over distance. This is dispersion. It’s expected, usually needed, when using a flashlight, but not your friend when it occurs in fiber optics. In fiber, dispersion occurs as a result of distance: the farther the signal is transmitted, the weaker, or more degraded, it becomes. The fiber must propagate enough light to meet the bare minimum required by the receiving electronics.

Scattering

Signal loss or degradation can occur when there are microscopic variations in the fiber, which, well, scatter the light. Scattering can be caused by fluctuations in the fiber’s composition or density, and is most often due to issues in manufacturing.

Bending

When fiber optic cables are bent too much (and yes, there’s a mathematical formula for that), there can be a loss or degradation in data delivery. Bending can cause the light to be reflected at odd angles, and can involve the cable as a whole (macrobending) or microscopic deformations within it (microbending).
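The “mathematical formula” above is, in practice, a minimum bend radius. A common installer rule of thumb―not a universal spec, and one you should always confirm against the manufacturer’s datasheet―is roughly 10× the cable’s outer diameter at rest and 20× while under installation tension. A quick sketch of that check:

```python
def min_bend_radius_mm(outer_diameter_mm: float,
                       under_tension: bool = False) -> float:
    """Rule-of-thumb minimum bend radius: 10x the cable's outer
    diameter at rest, 20x under pulling tension. Real limits come
    from the cable manufacturer's datasheet."""
    multiplier = 20 if under_tension else 10
    return outer_diameter_mm * multiplier

def bend_ok(bend_radius_mm: float, outer_diameter_mm: float,
            under_tension: bool = False) -> bool:
    """True if the planned bend is no tighter than the minimum."""
    return bend_radius_mm >= min_bend_radius_mm(outer_diameter_mm,
                                                under_tension)

# A 3.0 mm patch cord bent at a 35 mm radius at rest: fine
print(bend_ok(35, 3.0))        # True  (minimum is 30 mm)

# The same cord bent at 50 mm while being pulled: too tight
print(bend_ok(50, 3.0, True))  # False (minimum is 60 mm)
```

Bends tighter than the minimum are exactly where macrobending losses show up on an OTDR trace.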

To the rescue―the Fiber Optic Characterization Study

Thankfully, determining the health of fiber optics doesn’t rely on a Plug it in and see if it works approach. It’s a good thing, considering there are an estimated 113,000 miles of fiber optic cable traversing the United States. And that number just represents “long haul” fiber, and doesn’t include fiber networks built within cities or metro areas.

Fiber Characterization studies determine the overall health of a fiber network. The study consists of a series of tests that ultimately determine if the fiber in question can deliver its targeted bandwidth. As part of the study, connectors―the source of the vast majority of issues―are tested, and the types and degrees of signal loss are calculated, including core asymmetry, polarization, insertion and optical return loss, backscattering, reflection and several types of dispersion.

As you probably guessed, Fiber Characterization studies aren’t conducted in-house, unless your house maintains the engineering skill sets and equipment to carry it out.

Questions about Fiber Characterization studies? Turn to the experts

Yes, fiber optics is glass, but that doesn’t mean it will last forever, even if it never tangles with its arch nemesis―the backhoe. Whether it’s buried underground or strung aerially, it has a shelf life. And while that shelf life is far longer than its copper or coax counterparts’, it will degrade, then fail, over time. Whether you’re a service provider or utilize your own enterprise fiber optic network, success relies on the three (3) D’s―dependable delivery of data. A Fiber Characterization Study will help you achieve them.

If you have questions about optical networking, including Fiber Characterization studies, contact The GDT Optical Transport Team at Optical@gdt.com. They’re highly experienced optical engineers and architects who support some of the largest enterprises and service providers in the world. They’d love to hear from you.

The Hyper in Hyperconvergence

By Richard Arneson

The word hyper probably brings to mind energy, and lots of it, possibly as it relates to a kid who paints on the dining room wall or breaks things, usually of value. But in the IT industry, hyper takes on an entirely different meaning, at least when combined with its compound counterpart―visor.

Hyperconvergence, in regards to data center infrastructures, is a step up from convergence, and a stepping stone to composable infrastructure. And, of course, convergence is an upgrade from traditional data center infrastructures, which are still widely used but eschew, among other things, virtualization. Traditional data center infrastructures are heavily siloed, requiring separate skill sets in storage, networking, software and more.

The Hypervisor―the engine that drives virtualization

Another compound word using hyper is what delivers the hyper in hyperconvergence―hypervisor. In hyperconvergence, hypervisors manage virtual machines (VMs), each of which can run its own programs while appearing to have dedicated use of the host hardware’s memory, processor and other resources. The word hypervisor sounds like a tangible product, but it’s software, provided by, among others, market leaders VMware, Microsoft and Oracle. The hypervisor is what allocates those resources, including memory and processor cycles, to the VMs. Think of hypervisors as a platform for virtual machines.
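The resource-allocation job described above can be sketched in a few lines. This is a conceptual toy only―real hypervisors like ESXi, Hyper-V and KVM do vastly more (scheduling, memory ballooning, overcommit)―but the basic bookkeeping of carving a host into per-VM allocations looks like this:

```python
# Conceptual sketch of a hypervisor's bookkeeping: carving host
# resources into per-VM allocations. Names and numbers are illustrative.
class Host:
    def __init__(self, cpus: int, memory_gb: int):
        self.free_cpus = cpus
        self.free_memory_gb = memory_gb
        self.vms: dict[str, tuple[int, int]] = {}

    def create_vm(self, name: str, cpus: int, memory_gb: int) -> bool:
        """Allocate resources to a VM if the host has capacity.
        (Simplified: no overcommit, which real hypervisors allow.)"""
        if cpus > self.free_cpus or memory_gb > self.free_memory_gb:
            return False  # admission control: not enough left
        self.free_cpus -= cpus
        self.free_memory_gb -= memory_gb
        self.vms[name] = (cpus, memory_gb)
        return True

host = Host(cpus=16, memory_gb=64)
print(host.create_vm("web-01", 4, 8))   # True
print(host.create_vm("db-01", 8, 32))   # True
print(host.create_vm("big-01", 8, 32))  # False: only 4 CPUs remain
```

Each VM “sees” its slice as if it were the whole machine; the hypervisor is the layer keeping the ledger.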

Two (2) Types of Hypervisors

Hypervisors come in two (2) flavors, and deciding between either comes down to several issues, including compatibility with existing hardware, the level and type of management required, and performance that will satisfy your organization’s specific needs. Oh, and don’t forget budgetary considerations.

Bare-Metal – Type 1

Type 1 hypervisors are loaded directly onto hardware that doesn’t come pre-loaded with an Operating System. Type 1 hypervisors are the Operating System, and are more flexible, provide better performance and, as you may have guessed, are more expensive than their Type 2 counterparts. They’re usually single-purpose servers that become part of the resource pools that support multiple applications for virtual machines.

Hosted – Type 2

A Type 2 hypervisor runs as an application loaded on the Operating System already installed on the hardware. But because it sits on top of the existing OS, it creates an additional layer of programming, or hardware abstraction, which is another way of saying it’s less efficient.

So which Type will you need?

In the event you’re looking to move to a hyperconverged infrastructure, both the type of hypervisor, and from which partner’s products to choose, will generate a spate of elements to evaluate, such as the management tools you’ll need, which hypervisor will perform best based on your workloads, the level of scalability and availability you’ll require, and, of course, how much you’ll be able to afford.

It’s a big decision, so consulting with hyperconvergence experts should probably be your first order of business. The talented solutions architects and engineers at GDT have delivered hyperconvergence solutions to enterprises and service providers of all sizes. They’d love to hear from you, and can be reached at SolutionsArchitects@gdt.com.

How does IoT fit with SD-WAN?

By Richard Arneson

Now that computing has been truly pushed out to the edge, it brings up questions about how it will mesh with today’s networks. The answer? Very well, especially regarding SD-WAN.

IoT is composed of three (3) types of components that make it work―sensors, gateways and the Cloud. No, smart phones aren’t one of them. In fact, and for simplicity’s sake, let’s not call smart phones devices. The technology sector is particularly adept at using words interchangeably when it shouldn’t. In this case, the confusing word is device. For instance, when you hear statistics estimating that the number of connected devices will exceed 20 billion by 2020, smart phones are not part of that figure. While smart phones are often called devices, and do have sensors that can detect tilt (gyroscope) and acceleration (accelerometer), IoT sensors extend beyond those devices (oops, I did it again; let’s call them pieces of equipment) that provide Internet connectivity―laptops, tablets and, yes, smart phones.

Sensors and Gateways and Clouds…oh my

Sensors are the edge devices, and can detect, among other things, temperature, pressure, water quality, and the presence of smoke or gas. Think Ring Doorbell or Nest Thermostat.

The gateway can be implemented in hardware or software (sometimes both), and handles the aggregation of connectivity and the encryption and decryption of IoT data. Gateways translate between the protocols used by IoT sensors, and handle management, onboarding (storage and analytics) and edge computing. Gateways, as the name suggests, serve as a bridge between IoT devices, their associated protocols, such as Wi-Fi or Bluetooth, and the environment where the gathered data gets utilized.
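The aggregate-and-translate role can be sketched in a few lines. The class and field names below are purely hypothetical―a real gateway would speak actual protocols (MQTT, BLE, Zigbee) and handle security―but the shape of the job is the same: collect raw local readings, then translate them into one payload the Cloud side understands.

```python
import json
from statistics import mean

# Hypothetical sketch: a gateway batches raw sensor readings and
# translates them into a single JSON payload for a cloud endpoint.
class Gateway:
    def __init__(self, site: str):
        self.site = site
        self.readings: dict[str, list[float]] = {}

    def ingest(self, sensor_id: str, value: float) -> None:
        """Collect a raw reading from a local sensor (e.g. over BLE or Wi-Fi)."""
        self.readings.setdefault(sensor_id, []).append(value)

    def to_cloud_payload(self) -> str:
        """Aggregate per-sensor averages into one uplink message."""
        summary = {sid: round(mean(vals), 2)
                   for sid, vals in self.readings.items()}
        return json.dumps({"site": self.site, "sensors": summary})

gw = Gateway("warehouse-7")
gw.ingest("temp-01", 21.5)
gw.ingest("temp-01", 22.1)
gw.ingest("smoke-02", 0.0)
print(gw.to_cloud_payload())
# {"site": "warehouse-7", "sensors": {"temp-01": 21.8, "smoke-02": 0.0}}
```

Batching at the gateway like this is also why constrained sensors don’t need to carry a full network stack themselves.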

SD-WAN and IoT

SD-WAN simplifies network management―period. And a subset of that simplicity comes in the form of visibility and predictability, which is exactly what IoT needs. SD-WAN can help ensure IoT devices in remote locations get the bandwidth and security they need, which is especially important considering IoT devices don’t have much computing power (for example, they usually don’t have enough to support Transport Layer Security (TLS)).

SD-WAN gives network managers the ability to segment traffic based on type―in this case, IoT―so device traffic can always be sent over the most optimal path. And SD-WAN traffic can be sent directly to a cloud services provider, such as AWS or Azure. In traditional architectures, such as MPLS, the traffic has to be backhauled to a data center, after which it’s handed off to the Internet. Hello, latency―not good for IoT devices that need real-time access and updating.
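The classify-then-route idea can be sketched simply. The paths, latencies and policy rules below are hypothetical stand-ins―real SD-WAN controllers measure link health continuously and apply far richer policies―but the core logic of picking the best eligible path per traffic class looks like this:

```python
# Hypothetical sketch of SD-WAN-style policy routing: classify traffic,
# then pick the best currently-eligible path for that class.
PATHS = {
    "mpls":      {"latency_ms": 30, "up": True},
    "broadband": {"latency_ms": 18, "up": True},
    "lte":       {"latency_ms": 55, "up": True},
}

POLICY = {
    # IoT telemetry needs real-time delivery: only low-latency links qualify
    "iot":  lambda p: p["up"] and p["latency_ms"] < 40,
    # Bulk transfers can ride any link that's up
    "bulk": lambda p: p["up"],
}

def select_path(traffic_class: str) -> str:
    """Return the lowest-latency path that satisfies the class's policy."""
    eligible = {name: p for name, p in PATHS.items()
                if POLICY[traffic_class](p)}
    return min(eligible, key=lambda name: eligible[name]["latency_ms"])

print(select_path("iot"))  # broadband
```

If the broadband link goes down (`PATHS["broadband"]["up"] = False`), the same call fails over to MPLS automatically―the kind of per-class, real-time path decision SD-WAN makes at scale.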

SD-WAN is vendor-agnostic, and can run over virtually any existing topology, such as cellular, broadband and Wi-Fi, which makes it easier to connect devices in some of the more far-flung locations. And management can be accomplished through a central location, which makes it easier to integrate services across the IoT architecture of your choosing.

As mentioned earlier, there will be an estimated 20 billion IoT devices in use by 2020, up from 11 billion presently (by 2025…over 50 billion). The number of current endpoints being used is amazing, but the growth rate is truly staggering. And for IoT to deliver on its intended capabilities, it needs a network that can help it successfully deliver access to real-time data. That sounds like SD-WAN.

Here’s a great resource

To find out more about SD-WAN and exactly how it provides an ideal complement to IoT, contact GDT’s tenured SD-WAN engineers and solutions architects at SDN@gdt.com. They’ve implemented SD-WAN and IoT solutions for some of the largest enterprise networks and service providers in the world. They’d love to hear from you.

Unwrapping DevOps

By Richard Arneson

As the name suggests, DevOps is the shortened combination of two (2) words―development and operations. Traditionally, application development was time-consuming, fraught with errors and bugs, and, ultimately, resulted in the bane of the business world―slow to market.

Prior to DevOps, which addresses that slow to market issue, application developers worked in sequestered silos. They would collaborate with operations at a minimum, if at all. They’d gather requirements from operations, write huge chunks of code, then deliver their results weeks, maybe months, later.

The primary issue that can sabotage any relationship, whether personal or professional, is a lack of communication. Now sprinkle collaboration into the mix, and you have DevOps. It broke down the communication and collaboration walls that still exist―where DevOps isn’t being utilized―between the two (2) teams. The result? Faster time to market.

Off-Shoot of Agile Development

DevOps, which has been around for approximately ten (10) years, was born out of Agile Development, which was created roughly ten (10) years prior to that. Agile Development is, simply, an approach to software development. Agile, as the name suggests, delivers the final project with more speed, or agility. It breaks down software development into smaller, more manageable chunks, and solicits feedback throughout the development process. As a result, application development became far more flexible and capable of responding to needs and changes much faster.

While many use Agile and DevOps interchangeably, they’re not the same

While Agile provides tremendous benefits as it relates to software development, it stops short of what DevOps provides. While DevOps can certainly utilize Agile methodologies, it doesn’t drop off the finished product, then quickly move on to the next one. Agile is a little like getting a custom-made device that solves some type of problem; DevOps will make the device, as well, but will also install it in the safest and most effective manner. In short, Agile is about developing applications―DevOps both develops and deploys them.

How does DevOps address Time to Market?

Prior to DevOps and Agile, application developers would deliver their release to operations, which was responsible for testing the resultant software. And when testing isn’t conducted throughout the development process, operations is left with a very large application, often littered with issues and errors. Hundreds of thousands of lines of code that access multiple databases, networks and interfaces can require a tremendous number of man hours to test, which in turn pulls those man hours away from other pressing projects―inefficient and wasteful. Often there was no single person or entity responsible for overseeing the entire project, and each department may have had different success metrics. Going back to the relationship analogy, poor communication and collaboration mean frustration and dissatisfaction for all parties involved. And with troubled relationships comes finger-pointing.

Automation

One of the key elements of DevOps is its use of automation, which helps deliver faster, more reliable deployments. Through automated testing tools like Selenium, Test Studio and TestNG, to name a few, test cases can be constructed, then run while the application is being built. This reduces testing times exponentially and helps ensure each of the processes and features has been developed error-free.
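The idea behind those tools can be shown in miniature. The feature below (a hypothetical discount function) and its test are written side by side, and the test runs on every build rather than weeks later―tools like Selenium or TestNG apply the same pattern to entire applications:

```python
# Minimal sketch of test-alongside-development: the test case is
# written with the feature and runs automatically on every build.
def apply_discount(price: float, percent: float) -> float:
    """Feature under development: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount():
    assert apply_discount(100.0, 20) == 80.0
    assert apply_discount(59.99, 0) == 59.99
    try:
        apply_discount(10.0, 150)
    except ValueError:
        pass  # invalid input correctly rejected
    else:
        raise AssertionError("expected ValueError")

test_apply_discount()  # in practice, a CI pipeline runs this on every commit
print("all tests passed")
```

Because a failing test stops the build immediately, errors are caught while the code is a few lines old instead of a few months old.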

Automation is utilized for more than just testing, however. Workflows in development and deployment can be automated, enhancing collaboration and communication and, of course, shortening the delivery process. Production-ready environments that have already been tested can be continuously delivered. Real-time reporting can provide a window into any changes, or defects, that have taken place. And automated processes mean fewer mistakes due to human error.

Questions about what DevOps can deliver to your organization?

While DevOps isn’t a product, it’s certainly an integral component to consider when evaluating a Managed Services Provider (MSP). GDT’s DevOps professionals have time and again helped to provide and deploy customer solutions that have helped shorten their time to market and more rapidly enjoy positive business outcomes. For more information about DevOps and the many benefits it can provide to organizations of all sizes, contact GDT’s talented, tenured solutions architects at SolutionsArchitects@gdt.com. They’d love to hear from you.

How do you secure a Cloud?

By Richard Arneson

Every organization has moved, has plans to move, or wants to move to The Cloud. And by 2020, most will be there. According to a recent survey, within two (2) years 83% of enterprise workloads will be in The Cloud―41% on public Clouds, like AWS and Microsoft Azure, 20% private-Cloud based, and 22% as part of a hybrid architecture. With the amount of traffic currently accessing The Cloud, and considering the aforementioned survey figures, security will continue to be at the forefront of IT departments’ collective minds―as well it should.

With organizations selectively determining what will run in The Cloud, security can prove challenging. Now throw in DevOps’ ability to build and test Cloud apps faster and more easily, and you’ve amped up those Cloud security concerns significantly.

Security Solutions geared for The Cloud

To address the spate of Cloud-related security concerns, Cisco built an extensive portfolio of solutions, listed below, to secure customers’ Cloud environments, whether public, private, or a combination of both (hybrid).

Cisco Cloudlock

The Cloudlock DLP (Data Loss Prevention) technology doesn’t rest; it continuously monitors Cloud environments to detect sensitive information, then protect it. Cloudlock controls Cloud apps that connect to customers’ networks, enforces data security, provides risk profiles and enforces security policies.

Cisco Email Security

Cisco Email Security protects Cloud-hosted email, defending organizations from threats and phishing attacks in G Suite and Office 365.

Cisco Stealthwatch Cloud

Stealthwatch Cloud detects abnormal behavior and threats, then quickly quells them before they evolve into a disastrous breach.

Cisco Umbrella

Cisco Umbrella provides user protection regardless of the type, or location, of Internet access. It utilizes deep threat intelligence to provide a safety net—OK, an umbrella—for users by preventing them access to malicious, online destinations, and thwarts any suspect callback activities.

Cisco SaaS Cloud Security

If users are off-network, anti-virus software is often the only protection available. Cisco’s AMP (Advanced Malware Protection) for Endpoints prevents threats at their point of entry, and continuously tracks each and every file that accesses those endpoints. AMP can uncover the most advanced of threats, including ransomware and file-less malware.

Cisco Hybrid Cloud Workload Protection

Cisco Tetration, the company’s proprietary analytics system, provides workload protection for multi-cloud environments and data centers. It uses zero-trust segmentation, which enables users to quickly identify security threats and reduce their attack surface (all endpoints where threats can gain entry). It supports on-prem and public Cloud workloads, and is infrastructure-agnostic.

Cisco’s Next-Gen Cloud Firewalls

Cisco’s VPN capabilities and virtual Next-Gen Firewalls provide flexible deployment options, so protection can be administered exactly where and when it’s needed, whether on-prem or in the Cloud.

For more information…

With the help of its state-of-the-art Security Operations Center (SOC), GDT’s team of security professionals and analysts have been securing the networks of some of the most noteworthy enterprises and service providers in the world. They’re highly experienced at implementing, managing and monitoring Cisco security solutions. You can reach them at SOC@gdt.com. They’d love to hear from you.