For the last few years, IT has become good at deploying cloud infrastructure. We are successfully standing up clouds on premises, off premises, and even creating hybrid models. Congratulations – we now have clouds.

The problem is that once something exists, expectations shift. Our businesses and our customers assume the infrastructure is in place, so the emphasis is now on using clouds to add value. Our businesses are looking to IT to deliver business outcomes: predictively spot new business opportunities, deliver hyper-personalized experiences to customers, and innovate at a pace we’ve never managed before. IT service deployments are expected to be completed in minutes, hours, or days, as opposed to months or years. We are expected to have tools operating in real time while remaining trustworthy, secure, transparent, and compliant.

There is another key reason why IT has to shift from simply building clouds to actually exploiting them. Every one of our businesses is threatened by companies that are more adept at using modern IT technologies. Uber, Tesla, and Amazon are good examples of companies that aggressively use new technologies to disrupt an industry. In every industry today, there is an equivalent to one of these companies. Some of you may be that equivalent.

The way to compete in an environment where these disruptors are present is to use cloud technology and the new overall IT stack to deliver more complex, intelligent services. We’re starting to see this materialize in an entirely different IT stack, often running on the same clouds as the existing mission-critical apps that support existing business processes.
However, this new stack is built in the image of the new disruptive web-scale companies. This new IT stack, based on data lakes, uses new data fabrics that fundamentally allow aggregation of all structured and unstructured data from both internal and external sources.

On top of these data fabrics, we are seeing new application frameworks materialize. Cloud Foundry is a good example of a modern framework designed to efficiently compose a new set of apps from microservices, supported by modern agile software development practices. Programming languages are also changing: if you are going to develop a new app to serve your business and you’ve decided to run it in a microservices model with a containerized architecture, the odds are incredibly high that you will be developing it in Go, Node.js, or another new open source language.

Most importantly, the operating model in this stack is radically different. Traditional “test/dev” has been replaced by “dev/ops,” where developers continuously create, instantiate, deploy, and even scale applications as a self-service operation.

To thrive in this new IT environment, we all need to be using clouds in two ways:

1. To keep improving our IT environments as they exist today – eliminating time and cost; and
2. As a place where we build a new parallel IT stack that enables us to create new software apps, delivering new products and services to disrupt our industry.

The exciting news for IT professionals is that this transformation is happening, and IT will be more important than ever. The bad news? We’re not going to get much sleep for the next 10 years.

This post is based on John’s CiscoLive! keynote, “The Evolution of the IT Career.” A recording of the presentation is available here (registration required).
Data is king.

In a world reshaped by digital transformation, data has become an integral piece of the decision-making process within an organization. Business models today are built around data, enabling leaders to make big decisions to increase revenue, decrease cost, and reduce risk. Too often, however, organizations view the protection of that data as overhead or an insurance policy, rarely looking at the data they’ve actually stored. That data holds a lot of value that, if accessed, could be a game changer for business insights and intelligence.

Organizations need to start thinking about data management by considering the following questions:

- How accessible is your data?
- How likely is it that your data will survive multiple disasters?
- What metadata do you need?
- How important is the privacy and security of your data?

And of course, there are industry trends that will shape the way you think about data management and protection.

With artificial intelligence and deep learning, risk and responsibility lead to big outcomes.

Over 70 percent of enterprise companies are expected to leverage AI by the year 2021. Innovation in specialized hardware is accelerating deep learning capabilities in the data center, and machine learning algorithms applied to historical data now yield business insights that drive growth.

Building AI models and instantiating them is a vital investment – which, of course, comes with risk. That data is incredibly valuable, and the reality is that at some point it may be lost, damaged, corrupted, or compromised – the equivalent of taking 500 of the smartest people at your company and having them disappear. Data protection, and the accessibility of that data, has become more important than ever for the future of the data center – and everywhere data lives, including the cloud and the edge.
This is more than just an insurance policy – it is critical to ensuring your most valuable data assets are there when you need them to drive growth and a competitive market position.

Making your data protection intelligent is complicated.

Data protection today is a very complex task: there are many manual configurations to consider, a need for specialized administrators, and an intensive process for accessing historical data. Fast forward into the future: we would like a fully autonomous system that isn’t just storing data arbitrarily, but using compute capacity to gain insights and develop a model that becomes the brain for your business – one from which the algorithms, results, trends, and trained models could extract the most value. Think about what this could do!

When you consider data protection – protecting and trusting where your data lands – it’s not just about making sure the data is resilient. It’s about making sure that all the effort you expended to develop an AI learning system over an extended period of time – the code and the data – lands somewhere it is actually protected. The system must be intelligent enough to know in which location to store the data and what metadata is needed to reproduce it, and be able to predict possible data-loss events and prepare for them in advance. And intelligence drives simplicity.

New storage technologies give data protection a boost.

Most protection data today is kept on spindles. As new storage media become cheaper, high-capacity QLC flash devices continue to expand in capacity without increasing in price. Systems arriving on the market can store secondary data (i.e. the backup data) on much faster storage devices while retaining the price and efficiency of a data protection system. In the future, protection storage will be integrated into an AI system, making insights from past data available. Gaining insights directly from the secondary protection system will also offload some of the load and capacity from the primary storage arrays, allowing wider adoption of new primary-storage technologies such as storage class memory (SCM).

It’s an AI, new-application, multi-cloud, and IoT must-have!

As data becomes critical to the organization, a simplified, coherent data protection plan is required for new application development mechanisms too. More and more enterprises are moving to hybrid cloud and multi-cloud strategies, meaning data resides not only on premises or in a single cloud, but across multiple different clouds. Over 20 percent of enterprises believe they will use more than five separate clouds in the future. This makes protection management an even more complex task – but it also opens opportunities. In addition, IoT devices that generate huge amounts of data pose a challenge for proper data management and protection strategies. I plan to explore all of this and more in future posts.

As we can see, the value of business data paired with the reality of artificial intelligence has created a new world order. Our research shows that data protection initiatives are among the most common initiatives undertaken by companies looking to transform and modernize their IT. The outcome? A robust, current environment adapted to keep up with the data generated by artificially intelligent infrastructure. Data protection is not merely an insurance policy. It’s a must-have for making the big decisions and gaining the insights needed to stay competitive in this digital landscape.
I know what you may be thinking – that you’ve seen this Dell EMC Unity blog before. I assure you, it just looks that way as we provide you with the latest updates to Dell EMC Midrange Storage. But this time, in addition to product news, there’s an extra special message surrounding this blog post that I’d like to share with you.

Customers, partners, and industry voices have weighed in to name the Dell EMC Unity 650F All-Flash Array the winner of the CRN 2018 Product of the Year award in the Midrange Storage (Technology) subcategory. This product award is a testament to the best-in-breed technical innovation, reliability, and quality Dell EMC continues to deliver across the midrange storage portfolio, and it affirms our promise to continue to innovate and invest in our midrange storage offerings. In short, we continue to make Dell EMC Unity better – and our channel partners and customers around the world agree.

And so today, we’re announcing that we’ve made Dell EMC Unity even better with the newest Operating Environment (OE) 4.5 release, including expanded software features such as advanced data reduction, data protection, and management functions. And for our customers who are already invested in the cloud, or those just starting out, Dell EMC Unity also expands its cloud presence with more cloud deployment options. Customers will also benefit from the quality advancements that come with every Dell EMC Unity release.

“We continue to be impressed with the enhancements that Dell EMC has made with the Unity platform. The new OE 4.5 release that includes advanced data reduction technology has allowed Arrow ECS to more efficiently manage our data.
Using the new File Level Retention and quota management software has enabled us to improve the management of storage allocation and regulatory compliance requirements, while Metrosync Manager will help our business reduce unplanned downtime by allowing synchronous file storage resources to fail over to our destination site in the event of a disaster.”
– Seife Teklu, Senior Solutions Architect, Arrow ECS

Let’s look at what’s new with Dell EMC Unity OE 4.5:

Advanced inline deduplication
We’ve included Advanced In-line Dynamic Pattern Detection, which considers all data patterns, enabling you to see increased efficiency over OE 4.4 with up to 2.7:1 data reduction savings applicable to both file and block data.

New software to prevent file data loss
New file-level retention capabilities will protect files from modification or deletion until a specified retention date. This is especially useful, for example, when companies are bound by new regulations and compliance requirements for data with extended life cycles that require longer maintenance and retention periods.

Software-defined storage with HA
With Dell EMC Unity OE 4.5, we’ve added native high availability (HA) for our software-defined Dell EMC UnityVSA Professional Edition, with 2-node, 2-core, and Tie Breaker Node configurations for 10, 25, and 50 TB capacity offerings. The Dell EMC UnityVSA Tie Breaker Node, or witness node, is a lightweight third member of the cluster that arbitrates in the unlikely event all communication is lost between the two nodes, avoiding data corruption. And don’t forget that you can always download and try Dell EMC UnityVSA Community Edition for free.

But there’s more!
Not directly associated with the OE 4.5 release, we’ve also invested in advancing the ability of Dell EMC Unity to participate in hybrid cloud environments – and it’s all available now.

Dell EMC UnityVSA Cloud Edition runs in the AWS cloud
With our new Dell EMC Unity Cloud Edition, you’ll be able to deploy fully functional Dell EMC Unity storage as a VM in a VMware Cloud environment, with initial qualification for VMware Cloud on AWS. In this version, you’ll be able to deploy:

- Comprehensive UFS64 file services for VMware Cloud;
- Cloud-based DR capabilities with native asynchronous replication; and
- Scalable test and development environments without additional Dell EMC Unity hardware.

File management gets even better
Metrosync for Unity synchronous file replication, which came with OE 4.4, is now enhanced with the addition of Metrosync Manager. Metrosync Manager will enhance orchestration, replication granularity, and failover capabilities for your synchronous file replication.

Dell EMC Unity validated with VMware Cloud Foundation
Dell EMC Unity and VMware Cloud Foundation NFS qualification allows you to implement “Do It Yourself” cloud building blocks to design and build a custom cloud platform using best-of-breed cloud-enabled infrastructure such as Dell EMC Unity. Dell EMC Unity is the first NFS-based (i.e. file) external storage array family to be validated with VMware Cloud Foundation. It further demonstrates the tangible benefits of buying the complete Dell Technologies stack and highlights our Dell EMC Unity engineering investments in VMware.

Finally, we’re always enthusiastic when third parties seek us out to review our products. In this case, here’s a StorageReview paper on Dell EMC Unity that provides a 360-degree view of the innovation we deliver – management, performance, architecture, and software. I encourage you to read the review – it’s compelling.

As always, thank you for your business.
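As an aside for readers curious about the Tie Breaker Node mentioned above: the witness is a general high-availability pattern, and a small sketch makes the idea concrete. The following Python is purely illustrative – it is not Dell EMC’s implementation, and the class names and the sticky first-requester grant policy are assumptions made for clarity – but it shows how a third, lightweight voter lets at most one node of a two-node cluster keep serving when the nodes can no longer see each other.

```python
class TieBreaker:
    """Conceptual witness node holding a single, sticky vote.

    When the two data nodes lose contact with each other, each asks the
    witness for permission to keep serving. The witness grants its vote
    to the first requester and refuses everyone else, so at most one
    node continues -- preventing split-brain data corruption.
    """

    def __init__(self):
        self._granted_to = None  # node currently holding the vote

    def request_vote(self, node: str) -> bool:
        if self._granted_to is None:
            self._granted_to = node  # first requester wins; grant is sticky
        return self._granted_to == node


def may_serve(node: str, peer_reachable: bool, witness: TieBreaker) -> bool:
    """A node serves normally while it can see its peer; otherwise it
    may serve only if the witness grants it the tie-breaking vote."""
    if peer_reachable:
        return True
    return witness.request_vote(node)


if __name__ == "__main__":
    witness = TieBreaker()
    # Healthy cluster: nodes see each other, the witness is never consulted.
    print(may_serve("node-a", True, witness))   # True
    # Inter-node link fails: only the first node to reach the witness serves.
    print(may_serve("node-a", False, witness))  # True  (vote granted to node-a)
    print(may_serve("node-b", False, witness))  # False (vote already taken)
```

Production witnesses use leases with timeouts rather than a permanently sticky grant, so the surviving node can hand the vote back once the cluster heals, but the arbitration principle is the same.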
KENOSHA, Wis. (AP) — Authorities say two police officers who were on the scene when a white officer shot and partially paralyzed a Black man in Wisconsin have returned to duty. The update announced Wednesday comes as Officer Rusten Sheskey, who shot Jacob Blake seven times on Aug. 23 in Kenosha, remains on administrative leave while a police review board examines the case. Sheskey was placed on administrative leave following Blake’s shooting along with Officers Vincent Arenas and Brittany Meronek. Police said in a statement Wednesday that Arenas and Meronek returned to duty Jan. 20. Hundreds of people were arrested and multiple businesses were destroyed during protests following Blake’s shooting. Kenosha County District Attorney Michael Graveley this month declined to file charges against Sheskey.
DES PLAINES, Ill. (AP) — Fire officials say a space heater may have caused a blaze in suburban Chicago that killed four young girls and their mom. The Des Plaines Fire Department says the space heater was at the top of stairs that were the only way in and out of the second-floor unit. That’s where all of the family members were found. The space heater’s placement may have blocked the family’s only path out of the burning building. No smoke detectors were on the second floor. Foul play is not suspected. The fire killed a 25-year-old woman and her four daughters, ages 1 to 6. The Des Plaines Fire Department is continuing its investigation.