Month: February 2021


IDC estimates that in two short years, flash-optimized arrays will account for 58% of the overall external enterprise storage systems market. The agile data center requires the increased scalability and performance that flash provides. That is why all-flash storage is a key focus for EMC, VCE, and our customers.

With continued innovation, VCE expands its product portfolio with the introduction of VMAX All Flash storage arrays in Vblock and VxBlock 740 systems. By providing additional options that are customized to meet evolving business needs and market trends, VCE and EMC can accelerate and drive the shift towards a modern data center.

What is VMAX All Flash?

VMAX All Flash is the first-to-market all-flash array to deliver performance, scale, high availability, and advanced data services for all mission-critical applications. The new array uses high-capacity drives to offer high I/O density for extreme levels of performance with low latency. And with simple appliance packaging, it is incredibly easy to scale up and scale out at a granular level to fit changing requirements.

VMAX All Flash configurations are ideal for customers who require multi-dimensional scale and consolidation, business continuity and disaster recovery for high levels of availability, and mission-critical storage.

What does this mean for VCE Customers?

VCE continues to provide additional options for customers who want to move towards an all-flash, modern data center. Before the launch of VMAX All Flash, VCE customers had a choice of three hybrid VMAX3 offerings in their V(x)Block 740: the 100K, 200K, and 400K. Now, customers can select all-flash configurations with the VMAX 450 F/FX and VMAX 850 F/FX.

For V(x)Block customers, the new VMAX All Flash arrays mean greater performance and lower total cost of ownership with fewer drive replacements, less energy, and smaller data center footprints, all delivered in a single all-flash bay. The TCO proposition of VMAX All Flash arrays is extremely compelling: an 800TB configuration with Vblock used to require up to nine bays of traditional VMAX arrays, but with VMAX All Flash, customers can reduce that footprint down to a single bay, lowering TCO by 40%. In addition, VMAX All Flash will be orderable as a VCE Technology Extension to existing Vblock and VxBlock Systems and VCE Vscale™ Architecture implementations, providing customers with a path to significantly extend the performance and capabilities of their current VCE-enabled data centers.

Finally, with the recent launch of DSSD D5, EMC's first rack-scale flash solution, VCE can continue to innovate and expand its product portfolio by incorporating this dense, shared storage solution into hyper-converged offerings. DSSD D5 is truly revolutionary: it is the ideal solution for data-intensive applications, providing unparalleled performance and flexibility to deliver insight faster and move the business forward.

Where can I find out more?

Read VCE President Chad Sakac's blog on VMAX All Flash and why customers are moving towards an all-flash data center with VMAX as a building block on VCE converged platforms. In addition, read Chad's take on DSSD D5 and what it means for future VCE hyper-converged products. Visit vce.com and emc.com for more information.


Pure Nonsense – Separating Fact from Fiction in Flash

The buzz about NVMe has recently reached a fever pitch, and rightfully so. It's the perfect technology buzzword: a nice little acronym that blends the right amount of nerdy and cool and promises to change the landscape of storage forever. Who wouldn't want to talk about that? But as we've discussed before, it is important to understand that NVMe is more than simply putting a fancy new interface on the same old flash drives.

As one of the pioneers of NVMe, Dell EMC couldn't be happier that anticipation is building as NVMe finally comes to fruition in mainstream enterprise storage. When I say pioneer, I mean that literally: Dell EMC co-developed the NVMe standard starting back in 2007, and by the end of the year, we'll release our next NVMe array, which will be the first of many in our mainstream all-flash portfolio.

You may have heard Pure Storage talking about their new "100% NVMe" array, but before you get sucked in, let's level set on a few things:

NVMe is an interface. It replaces SAS/SATA to overcome the limitations of protocols that were designed over 30 years ago for spinning hard disk drives, and thereby takes advantage of the parallelism of modern CPUs and SSDs.
NVMe is NOT media. In the short term the media is NAND-based flash, the same media that vendors have been shipping in all-flash arrays for years.
NVMe will enable a modest improvement in latency, but at a premium price.
NVMe will, in the future, open up high-speed, low-latency access to the next-generation storage media called Storage Class Memory, or SCM (for example, 3D XPoint). And that's where things get interesting. SCM will dramatically improve latency over NAND-based flash media, just as NAND-based flash dramatically improved latency over spinning disk media.

In other words, SCM is Emerald City and NVMe is the yellow brick road. The game-changer in storage will be the combination of NVMe and SCM, not NVMe alone. (A small sketch at the end of this post makes the interface-versus-media distinction concrete.)

So what about that snazzy new "100% NVMe" array from Pure Storage? They built their own proprietary "NVMe drives" (dubbed "Flash Modules") for this new array, which was expensive, and customers are going to have to pay for that. To date, no performance benchmarks have been published for the new FlashArray //X (also of note: existing performance metrics for Pure Storage's FlashArray//M have been removed from their website as well). Which makes you wonder…

What benefits does the FlashArray //X have beyond unspecified performance gains over their last-generation array?
What performance will you really get from it, and does it justify the premium?
Given their proprietary approach, what is the go-forward path to SCM and when will that come?

The proprietary road is littered with failure. See Violin Memory as a recent example.

Dell EMC's Strategy for NVMe in Storage

We recently evolved our NVMe strategy based on customer feedback and are now focused on continuing to deliver NVMe across our portfolio over the next 6 to 24 months. In fact, we'll launch our first "mainstream" NVMe array by the end of the year. Unlike Pure Storage, we are leveraging industry-standard technology and collaborating with industry leaders like Intel to ensure our offerings are ready for enterprise requirements, while also minimizing price premiums for early adopters.
In addition, this approach allows us to accelerate and optimize delivery of the ideal enterprise-ready Storage Class Memory (SCM) devices as soon as they become available.

Dell EMC also continues to lead the industry in driving standards in NVMe with our cutting-edge portfolio of PowerEdge servers. We will offer NVMe in a variety of consumption models for storage, including arrays, software-defined storage, and converged and hyper-converged infrastructure solutions. As always, they will be backed by Dell EMC's world-class engineering, services and support organizations, who have been designing and testing solutions for mission-critical environments for decades.
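To make the interface-versus-media distinction above concrete, here is a minimal sketch (not part of the original post) that inspects Linux sysfs to report whether each block device sits behind the NVMe driver or a SCSI (SAS/SATA) path, and how many hardware submission queues the kernel exposes for it. NVMe devices typically expose many queues, often one per CPU core, while a SATA device behind AHCI is limited to a single command queue, which is the parallelism point made above. Sysfs layout varies by kernel version, so treat the paths as illustrative.

```python
#!/usr/bin/env python3
"""Rough sketch: list Linux block devices and report whether each sits behind
the NVMe driver or a SAS/SATA (SCSI) path, plus how many hardware submission
queues the kernel exposes for it. Paths may differ between kernel versions."""

import os

SYS_BLOCK = "/sys/block"

def classify(dev: str) -> str:
    # NVMe namespaces show up as nvme<ctrl>n<ns>; SCSI/SATA/SAS disks as sdX.
    if dev.startswith("nvme"):
        return "NVMe"
    if dev.startswith("sd"):
        return "SAS/SATA (SCSI)"
    return "other"

def hw_queue_count(dev: str) -> int:
    # blk-mq devices expose one directory per hardware queue under mq/.
    mq_dir = os.path.join(SYS_BLOCK, dev, "mq")
    try:
        return len(os.listdir(mq_dir))
    except FileNotFoundError:
        return 1  # legacy single-queue path

def is_rotational(dev: str) -> bool:
    path = os.path.join(SYS_BLOCK, dev, "queue", "rotational")
    with open(path) as f:
        return f.read().strip() == "1"

if __name__ == "__main__":
    for dev in sorted(os.listdir(SYS_BLOCK)):
        if dev.startswith(("loop", "ram", "dm-")):
            continue  # skip virtual devices
        print(f"{dev:<12} {classify(dev):<18} "
              f"hw_queues={hw_queue_count(dev):<4} "
              f"rotational={is_rotational(dev)}")
```

Running it on a mixed system would show the same NAND media reported as rotational=False in both cases, with the interface (and queue count) being the real difference.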


It all starts with data analytics: the practice of applying modern analytics software tools across data of all types, including unstructured, semi-structured, and structured data, and across both real-time/streaming and batch workloads. Discovering insights that enhance the understanding of business and customer behavior is the primary goal. These analytics-driven insights can be used to shape business outcomes, improve competitive advantage, enhance financial decisions and develop more precise projections.

I sat down with Erin Banks (@BanksEK), aka #BigDataBanks, at the Dell EMC Forum Montreal to get the latest. From pizza to elevators to Mexican food and grocery stores, Big Data is nothing without Big Data Analytics. We have the details this week.

For more information, visit dellemc.com/bigdata or email [email protected]. The Source app is available in the Apple App Store and Google Play, and you can subscribe to the podcast on iTunes, Stitcher Radio or Google Play.

Dell EMC The Source Podcast is hosted by Sam Marraccini (@SamMarraccini).
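As a minimal illustration of the batch side of that practice, the sketch below (not from the original post; the file names, columns and metric are hypothetical) combines a structured CSV of transactions with semi-structured JSON clickstream events to surface a simple behavioral insight.

```python
"""Hypothetical batch-analytics sketch: join structured and semi-structured
data to answer one behavioral question. File names and columns are made up."""

import json
import pandas as pd

# Structured data: point-of-sale transactions (customer_id, amount, ts).
transactions = pd.read_csv("transactions.csv")

# Semi-structured data: raw clickstream events, one JSON object per line
# (customer_id, page, ts).
with open("clickstream.jsonl") as f:
    events = pd.DataFrame([json.loads(line) for line in f])

# Simple behavioral question: do customers who browsed the "deals" page
# spend more per order?
browsed_deals = events.loc[events["page"] == "deals", "customer_id"].drop_duplicates()
transactions["saw_deals"] = transactions["customer_id"].isin(browsed_deals)

insight = transactions.groupby("saw_deals")["amount"].mean()
print(insight)
```

A streaming pipeline would ask the same kind of question continuously over event windows rather than over files at rest, but the analytical intent is identical.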


It is time to break the misconception that CIOs and CFOs have competing priorities to spend and save money. Every executive is tasked with doing the right thing for the whole company, not just for his or her function. That said, IT has traditionally been accounted for as a General and Administrative cost, so rather than being perceived as a strategic partner, IT is deemed a big cost center and a lever to pull to reduce costs.

In today's hypercompetitive global marketplace, creating and introducing a new product faster and closing a deal and collecting cash faster is not just an advantage, it's a necessity. IT holds the keys to make this happen, but we can't do it alone. We need to work more closely with our CFO and business partners to reduce, reallocate and reinvest in modernizing and transforming IT for the future.

Recognizing that the relationship between the CIO and CFO is also transforming, Dell EMC and Forbes Insights recently published a study titled IT TRANSFORMATION: Success Hinges on CIO/CFO Collaboration. After reading the study, Dell's CFO Tom Sweet and I discussed the findings in this video, but I also wanted to share my perspective as Dell and VMware's CIO.

First, while nearly all respondents agreed that CIO and CFO collaboration is critical, 89 percent said there are significant barriers, including outdated reporting structures, incentives and attitudes. Well, the good news is that both CIOs and CFOs are under immense pressure to do what's best for the company and to maintain or gain a competitive edge. If we don't, we will trail our competition, or worse, be disrupted by an upstart hungry to eat our lunch. How's that for a unifying incentive? On the plus side, collaborating more effectively with the CFO enables us to better understand the company's goals and financial pressures so we can strategically advise and give our business partners the levers needed to make informed decisions that propel us forward. So, it is a win-win.

Second, according to the study, it is becoming even more important for CIOs to have managerial and business skills, with 74 percent of respondents adding that CIOs should have an entrepreneurial mindset. For more than 20 years, I have made it my mission to aggressively and creatively deliver the innovation, performance and scale the business has demanded despite having flat or shrinking budgets. However, our continual focus on keeping the lights on and hitting tighter financial targets distracts us from being entrepreneurial. Not only do we need to find ways to automate and run leaner so we can better focus on innovating IT, but we need to show the CFO our progress in doing so. This builds trust and enhances our ability to sell our strategy and justify investments with the CFO and other leaders to truly modernize and transform IT.

And finally, the study shines a light on our need to drive a cultural change between CIOs and CFOs by designating change agents on both sides of the aisle and incentivizing them to collaborate. However, this alignment doesn't stop at the CFO level. Rather than reacting to requests, IT professionals need to closely partner with all our business leaders to truly understand their digital transformation strategy and share the responsibility for ensuring the success of their business initiatives. By embracing the Dell Digital Way, which leverages Pivotal technology and paired programming, an agile methodology, Pivotal Cloud Foundry and Spring-based tools, we are digitally transforming how our team does business.
In addition to enabling us to rapidly deliver a much better end product for the business, this also builds confidence with the CFO and our business partners.

The bottom line: we are at a crossroads. We can continue down the traditional, frequently traveled road and risk being disrupted. Or we can transform IT to work more closely with our CFO and the business to not only address the financial pressures all companies face, but also modernize IT to advance our digital transformation strategy and ultimately help Dell win big. Personally, I refuse to be disrupted, so the road less traveled looks much brighter to me.


first_imgIt’s difficult to imagine a world where you couldn’t order groceries, check your bank account, read the news, listen to music or watch your favorite show from the comfort of your smartphone. Perhaps even harder to fathom is that some of these services only hit the mainstream in the last 10-20 years. The world we live in today is a stark contrast to the world of 20 years ago. Banks no longer deliver value by holding gold bullion in vaults, but by providing fast, secure, frictionless online trading. Retailers no longer retain customers by having a store in every town, but by bringing superior customer service with extensive choice and a slick tailored user experience. Video rental shops are a thing of the past, replaced by addictive convenient media streaming services such as Netflix. The list goes on.Delight, Engage, Anticipate, RespondAt the beating heart of this digital convenience is software. A successful software organization can delight customers with superior user experience, can engage its market to determine demand, and can anticipate change – such as regulatory. It also has the means to respond quickly to any risk: security threats, economic flux or competitive threat. In summary, companies are rediscovering their competitive advantage through software and data.How to build good software? Contrary to popular belief, it’s more than just spinning up some microservices. Good software relies on several core pillars being present: abiding by lean management principles; harmonizing Dev and Ops to foster a DevOps culture; employing continuous delivery practices (such as fast iterations, small teams and version control); building software using modern architectures such as microservices; and last but not least, utilizing cloud operating models. Each year, the highly regarded State Of DevOps Report sees continued evidence that delivering software quickly, reliably and safely – based on the pillars mentioned above – contributes to organizational performance (profitability, productivity and customer satisfaction).Per the title, this blog series intends to focus on the cloud pillar. In the context of software innovation, cloud not only provides the Enterprise with agility, elasticity and on-demand self-service but also – if done right — the potential for cost optimization. Cost optimization is paramount to unlocking continued investment in innovation, and when it comes to cloud design, there should be no doubt: architecture matters.Application FirstHow should an organisation define its cloud strategy? Public cloud? Private cloud? Multi-cloud? I’d argue instead for an application first strategy. Applications are an organization’s lifeblood, and an application first strategy is a more practical approach that will direct applications to their most optimal destination. This will then naturally shape any cloud choice. An app first strategy looks to classify applications and map out their lifecycle, allowing organisations to place applications on optimal landing zones – across private, public and edge resources – based on the application’s business value, specific characteristics and any relevant organizational factors.Ultimately seeking affirmation whether to invest in, tolerate or decommission an application, companies can use application classification methodologies to categorize applications. Such categorization determines where (if any) change needs to happen. 
Change can happen at these three layers:

Application Layer
Platform Layer
Infrastructure Layer

The most substantial lift, but the one with the potential for the most business value, is a change to the application code itself, ranging from a complete rewrite, to materially altering the code, to optimizing existing code. For applications that don't merit source-code change, perhaps the value lies in evolving the platform layer. Re-platforming to a new runtime (from virtualized to containerized, for example) can unlock efficiencies not possible on the incumbent platform. For applications where transformation of the application code or platform layer is to be avoided at all costs, modernization of the infrastructure layer could make the most sense, reducing risk, complexity and TCO. Lastly, decommissioning applications at the end of their natural lifecycle is very much a critical piece of this jigsaw. After all, if nothing is ever decommissioned, no savings are made. This combination of re-platforming applications, modernizing infrastructure and decommissioning applications is crucial in freeing up investment for software innovation. (A short sketch at the end of this post shows one way such a classification might be encoded.)

Landing Zones

Where an application ultimately lands depends on its own unique characteristics and any relevant organisational factors. Characteristics include the application's performance profile, security needs, compliance requirements and any specific dependencies on other services. These diverse requirements across an Enterprise's application estate give rise to the concept of multi-cloud Landing Zones across Private, Public and Edge locations.

Cloud Chaos

Due to this need for landing zones, the industry has begun to standardize on a multi-cloud approach, and rightly so. Every application is different, and a multi-cloud model permits access to best-of-breed services across all clouds. Unfortunately, the multi-cloud approach does bring with it a myriad of challenges. For example, public clouds deliver native services in proprietary formats, often necessitating costly and sometimes unnecessary re-platforming. The need to re-skill a workforce compounds these challenges, as do the complex financials created by a multi-cloud model due to inconsistent SLA constructs across different providers. Lack of workload portability is another critical concern due to the previously mentioned proprietary formats. This is further exacerbated by proprietary security and networking stacks, often resulting in lock-in and increased costs.

Cloud Without Chaos

Dell Technologies Cloud is not a single public cloud, but rather a hybrid cloud framework which delivers consistent infrastructure and consistent operations, regardless of location. It is unique in the industry in its capability of running both VMware VMs and next-generation container-based applications consistently, irrespective of whether the location is private, public or edge. This consistent experience is central to enabling workload mobility, which itself is key to flexibility, agility and the avoidance of lock-in.

Through combined Dell Technologies and VMware innovation, core services such as the hypervisor, developer stacks, data protection, networking and security are consistent across private, public and edge locations. Dell Technologies Cloud reduces the need for complicated and costly re-platforming activities associated with migration to a new cloud provider's native proprietary services.
Nonetheless, organisations wishing to leverage native public cloud services can still do so while also benefiting from proximity to VMware-related public cloud services.

Consistent operations also reduce the strain on precious talent by allowing companies to capitalize on existing skillsets. Organizations can consistently manage applications, regardless of location, and avoid the costly financial implications of re-skilling staff each time they choose a new cloud provider.

Looking at this through the lens of the developer, and with modern applications in mind, container standards thankfully span the industry, which mitigates the need for wholesale container format changes between clouds. Despite this, each cloud provider has an opinion on the ecosystem around containers (container networking, container security, logging, service mesh, etc.) in their native offerings, such as CaaS and PaaS. This bias can precipitate the need for tweaks and edits each time an application is moved to another cloud, effectively burning developer cycles. Instead, an organization can maximize developer productivity by employing turnkey, cloud-agnostic developer solutions which are operationalized and ready for the Enterprise. The developer can write their application once and run it anywhere, without tweaks or edits required to suit a new cloud provider's stack.

At the other end of the application scale, most Enterprise organizations own a significant portion of non-cloud-ready, non-virtualized workloads such as bare metal and unstructured data. Through Dell Technologies' extensive portfolio, these workloads are fully supported on various platforms and never considered an afterthought.

Likewise, a critical element of any organization's cloud investment is its strategy around cloud data protection. Dell Technologies Data Protection Solutions cover all hybrid cloud protection use cases, from in-cloud backup, backup between clouds and long-term retention to the cloud, to DR to cloud and cloud-native protection.

Increased Agility, Improved Economics & Reduced Risk

Ultimately, Dell Technologies Cloud delivers increased business agility through self-service, automation and the unique proposition of true portability across private, public and edge locations. This agile, flexible and resilient foundation can enable Enterprise organizations to accelerate software innovation and, in turn, quicken time to market.

In addition to the business agility and workload mobility gained from this consistent hybrid cloud model, companies can also improve cloud economics, as well as leverage multiple consumption options, irrespective of cloud location. Any modernization offers mitigation of business risk by eliminating technical debt, minimizing operational complexities and bypassing unknown and inconsistent future financials.
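To make the application-first classification described earlier more tangible, here is a minimal sketch (not part of the original post; the scoring rules, thresholds and sample inventory are hypothetical) that maps each application's business value, cloud readiness and lifecycle stage to an invest/tolerate/decommission decision and a suggested layer of change.

```python
"""Hypothetical sketch of an application-first classification pass.
A real assessment would draw on far richer application and organizational data."""

from dataclasses import dataclass

@dataclass
class App:
    name: str
    business_value: int      # 1 (low) .. 5 (high)
    cloud_readiness: int     # 1 (bare metal / legacy) .. 5 (cloud-native)
    end_of_life: bool        # True if the app has reached the end of its lifecycle

def classify(app: App) -> tuple[str, str]:
    """Return (decision, layer of change) for one application."""
    if app.end_of_life:
        return "decommission", "none"
    if app.business_value >= 4 and app.cloud_readiness <= 2:
        # High value but hard to move: worth rewriting or materially altering code.
        return "invest", "application layer"
    if app.business_value >= 3:
        # Valuable and reasonably portable: re-platform (e.g. virtualized -> containerized).
        return "invest", "platform layer"
    # Lower value but still needed: tolerate it on modernized infrastructure.
    return "tolerate", "infrastructure layer"

if __name__ == "__main__":
    inventory = [
        App("order-management", business_value=5, cloud_readiness=2, end_of_life=False),
        App("hr-portal",        business_value=3, cloud_readiness=4, end_of_life=False),
        App("fax-gateway",      business_value=1, cloud_readiness=1, end_of_life=True),
        App("reporting-batch",  business_value=2, cloud_readiness=2, end_of_life=False),
    ]
    for app in inventory:
        decision, layer = classify(app)
        print(f"{app.name:<18} -> {decision:<12} change at: {layer}")
```

The output of such a pass is simply a starting point for conversation: the decision feeds the landing-zone choice, and the layer of change indicates where the investment (or the saving) will come from.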


WASHINGTON (AP) — President Joe Biden appears to be boosting his goal for coronavirus vaccinations in his first 100 days in office, suggesting the nation could soon be vaccinating 1.5 million Americans on average per day. Biden made the comments Monday as talks with Congress over a $1.9 trillion stimulus package showed few signs of progress. He signaled his increasing bullishness on the pace of vaccinations after signing an executive order to boost government purchases from U.S. manufacturers. It was among a flurry of moves by Biden during his first full week to publicly show he's taking swift action to heal an ailing economy.
