From enabling precision medicine to accelerating the development of new pharmaceuticals, high-performance computing is helping us take healthcare and life sciences to a new level.
In today’s digitally driven world, healthcare and life sciences are increasingly dependent on advanced technology. And in this new era, the business of saving lives can require an enormous amount of computational power.
In clinical settings, for example, high-performance computing (HPC) clusters are enabling precision medicine that allows physicians to tailor treatments to the unique needs of individual patients. Whether the caregiver is using genetic sequencing to discover gene mutations or running complex algorithms to enable narrowly targeted cancer treatments based on the patient’s genome, HPC is essential. None of this would be possible without the computational power of HPC systems, working in tandem with software applications, algorithms and lightning-fast storage and networking.
Consider this finding: It took 13 years and $3 billion to complete the first human whole-genome sequence (WGS) in 2003. Today, a whole genome can be sequenced for about $1,000 in as little as 22 minutes. The credit for much of this leap forward goes to the advent of faster and more affordable HPC clusters.
As industry veterans Mahni Ghorashi and Gaurav Garg note in a TechCrunch article, “Converting the raw data of the human genome into medically useful and understandable information has historically been a huge technical bottleneck, but over the course of the last decade, advances in compute, rather than laboratory processes, have driven the most dramatic time and cost reductions associated with WGS.”
Now consider the work being done by the Translational Genomics Research Institute (TGen). TGen’s Center for Rare Childhood Disorders is helping researchers discover gene mutations with a fast, powerful HPC cluster based on the Dell EMC Genomic Data Analysis Platform, the predecessor of today’s Dell EMC Ready Bundle for HPC Life Sciences. This cluster, powered by Dell EMC PowerEdge servers with Intel® Xeon® processors, runs extremely complex algorithms that analyze terabytes of genetic and molecular data at speeds unimaginable in the past.
TGen’s Center for Rare Childhood Disorders is confident that it can use its HPC cluster to facilitate new research going forward — and give new hope to patients and their families.
“We have the ability to more quickly perform genetic sequencing and meet the demand for processing increasing data volumes, because of the Dell EMC HPC cluster,” says James Lowey, TGen’s chief information officer. “And this technology also helps us ask harder questions of the data, and hopefully answer those questions. We are committed to helping children with rare disorders, and we are better equipped to do that with this solution.”
It’s not just precision medicine that is getting a lift from HPC. Pharmaceutical companies now use HPC systems routinely to develop new drugs and therapies designed to prevent and treat disease. HPC systems make it possible to analyze massive amounts of data and ask and answer ever-harder questions, all in the interest of bringing life-saving advances to market in less time.
On another life sciences front, the Beijing Genome Institute (BGI), a leading company in biological research, is working to bring down the cost of sequencing to make it more widely accessible, all while enhancing its own biological research capability. To do this, BGI’s strategic objective has been to continuously enhance the computing power of the HPC platform behind its sequencing activities.
To drive toward this objective, the institute brought in a Dell EMC PowerEdge FX modular infrastructure solution to increase the platform’s capacity. Since deployment of the Intel® Xeon® processor-powered Dell EMC PowerEdge FX unit, BGI has taken a leading position domestically. This has allowed BGI to develop the gene sequencing system BGISEQ-500, a one-button sequencing technology that supports DNA sampling and delivers analysis results in just 24 hours.
Ultimately, precision medicine helps physicians gain a closer understanding of each patient’s genetic makeup and specific requirements for treatment. These insights open the door to customized healthcare and new ways to prevent, diagnose and treat disease.
And it’s all powered by high-performance computing systems — which is another reason why HPC matters.
For a closer look at the work done by TGen’s Center for Rare Childhood Disorders, read the case study “Giving hope to children with rare disorders.”
 Edico Genome, “Edico Genome Offers Bundled Ultra-Rapid Solution for Clinical Genomics and Research,” Jan. 4, 2017.
 Dell EMC case study, “Using a modular architecture to understand the mysteries of life,” 2016.
As workplaces have evolved, so have the workforces that use them. Several distinct worker personas have emerged, each with its own demands for specific hardware, software and services. We think it’s time your customers knew more about them.
By understanding these personas, your sales team can quickly identify the types of people your customers employ, what their needs are, and what technology is right for them. Dell EMC has identified two personas in the office: desk-centric workers and “corridor warriors.” Let’s check out which offerings from the Dell Technologies portfolio suit their working needs.
Desk-centric employees predominantly use a desktop PC because they perform specific roles and need a fixed environment for functional, security, or compliance reasons.
Our technologies provide a comprehensive desk-centric user experience. Take the Dell OptiPlex, for example. It’s a desk-based system running Microsoft Windows 10 Pro, with a choice of form factors and mounting options to personalize a workspace. Alternatively, Dell Wyse terminals can provide the endpoint for a complete thin-client solution. And Dell VDI Complete, which brings together the front- and back-end infrastructure in a fully validated desktop-virtualization bundle, lets organizations deploy desktop virtualization faster than ever before.
For a real-world example of how one of our customers supports its desk-centric users, read the Meituan-Dianping case study. This Shanghai-based group-buying company turned to Dell EMC for hardware provisioning and saw a boost in IT productivity.
Corridor warriors are rarely in one place for long. They pound the office floors, flitting from meetings to brainstorming sessions and using every available collaboration space. Client devices are critical to giving these workers the best possible experience.
To meet their needs, we recommend the Dell Latitude 7000 Series 2-in-1, a Windows 10 Pro notebook that doubles as a tablet – perfect for those on the move. In addition, ProDeploy Plus enables fast deployment with preconfigured collaboration software such as Microsoft SharePoint and the cloud-based Office 365.
The widespread move of IT services to the cloud suits this persona perfectly. Our servers are built to work seamlessly with Microsoft Azure Stack, so these mobile users receive the same secure software environment wherever they’re working.
A real-life story of corridor warriors in action comes from the Austrian Institute of Technology (AIT), which employs over 1,100 researchers across eight sites, each with two workspaces on average. Read our case study to understand how the largest non-university research institute in Austria matched its workers with the right Dell EMC technology.
Technology has a huge potential to help organizations transform their workplaces, and by extension, transform their people’s working lives. We believe that approaching workers as personas is a critical part of workplace transformation, providing personalized products for how employees work today and in the future.
We’ll take care of the solutions, so you can take care of your customers.
Read the Desk-Centric Users and Corridor Warriors guides, as well as others, here.
We’ve also created related emails here, on our new Digital Marketing Platform, so that your marketing teams can quickly get these guides into the hands of your customers. The guides explain how to maximize employee productivity through the right choices from our end-to-end portfolio.
If you don’t have access to the Digital Marketing Platform, please register here.
In the channel industry, CRN’s Channel Madness Tournament of Chiefs begins with 32 of the industry’s most influential Channel Chiefs. It ends with one. Following the same format and schedule as the much-anticipated collegiate tournament, each channel chief will compete for bragging rights during CRN’s Channel Madness.
CRN’s fourth annual Channel Madness Tournament of Chiefs pits some of the channel’s best-known executives against each other in head-to-head battles, with CRN readers voting to determine the victors. Thirty-two of the industry’s most influential channel executives compete for the single title of favorite channel chief, making their way round by round through a bracket, moving closer and closer to the championship match, where bragging rights are at stake.
Voting in Round 1 Is Live Now!
Please take a moment to vote for your favorite channel chiefs. Dell EMC’s Joyce Mullen, President, Global Channel, OEM and IoT, is among the 32 Channel Chiefs chosen to participate in the 2018 CRN Channel Madness Tournament of Chiefs. She is joined by Dell Technologies executives Frank Rauch of VMware and Faraz Siraj of RSA.
HOW TO VOTE:
SPREAD THE WORD:
The winners of Round 1 will be announced on March 22 when Round 2 voting begins.
Good luck to all of the Channel Chiefs, and may the best chief win!
Enterprises everywhere are increasingly adopting converged infrastructure (CI) as one of the best ways to rapidly adopt new technologies, reduce risk and simplify operations.
Customers using Dell EMC VxBlock Systems report significantly better business outcomes with lower costs, faster time-to-deploy, simpler life cycle management and more time to focus on new business initiatives.
With the first Dell EMC Converged Systems deployed in 2009, Dell EMC continues to collaborate with Cisco to accelerate our customers’ adoption of new hardware and software.
Cisco Live Barcelona was the perfect opportunity to launch the latest VxBlock converged system, the VxBlock 1000. I was not able to attend, but via the magic of radio and a little help from Paul Young (@youngp2), it’s like I was there. Paul caught up with Jon Siegal (@Jon_Siegal), VP Product Marketing, Converged Platforms; Dave Hayward; and Tom O’Reilly (@Tom_Oreilly_x), CTO EMEA. For the latest on Dell EMC Converged offerings, be sure to follow @DellEMC_CI.
Dell EMC The Source Podcast is hosted by Sam Marraccini (@SamMarraccini)
The University of Arkansas needed to solve the same technological challenges large companies do, but it had the ambition to do so in a way that pulled processing back from endpoints for security and manageability while still serving up graphically robust, PC-like performance. Out-of-the-box solutions were coming up short when the university sat down with Dell EMC to devise an answer. The resulting VDI implementation project was honored with this year’s TechTarget Access Innovation Award as an exceptionally innovative and successful end-user computing project, based on four criteria: ease of use, innovation, functionality and performance, and value.
Andrew McDaniel, Director of VDI Ready Solutions at Dell EMC, recently sat down with Jon Kelly and Stephen Herzig from the University of Arkansas’ IT Services team to discuss the project and celebrate the award.
Andrew McDaniel (AM): When you came to Dell EMC with this VDI project, what challenges were you facing?
University of Arkansas (UA): We have 27,000 students, and we’re classified as a Research 1 University by the Carnegie Classification of Institutions of Higher Education, meaning our students engage in extensive research activity. As an IT team, we have an important responsibility to support student learning in a modern way that meets and exceeds the expectations and needs of our students and faculty. When Chris McCoy came on board as CIO, he wanted to leap forward technologically. He set eight technology projects as high priorities for the university to accomplish, and VDI was one of them.
Our challenge was that the campus had been BYOD for quite a while, and students could access some applications from their devices, but the applications that were more particular to a class or curriculum could only be accessed from a specific lab in a specific building. We wanted to level the playing field so students could have anytime, anywhere access to everything they needed for success on any device.
We then recognized that there was a trend toward GPU utilization within the VDI environment. Applications were being written with the assumption that a GPU existed.
AM: What were the main objectives you needed to accomplish?
UA: We set the following parameters for our project:
AM: After researching vendors, how did you choose Dell EMC and the configuration you ultimately went with: Dell PowerEdge R730 servers, Dell Wyse thin clients, NVIDIA GRID software and the VMware Horizon client?
UA: The ability to deliver a high-quality GPU experience was central to our goals, so we were pleased to discover that NVIDIA’s GRID software for abstracting GPUs would enable us to get the VM density we needed. This had been missing from a previous VDI project we’d rolled out, and it led to a low-quality experience, so NVIDIA’s technology was a key component for us.
On the hardware front, we selected the Dell PowerEdge R730 because it supported two GPUs and 14-core processors to deliver the fast, crisp experience we wanted to provide. We implemented Dell Wyse thin clients as the access points throughout the campus, and our engineers were able to optimize our software to get login time down to 18 seconds.
The VMware Horizon client made our VDI environment accessible from any device so we can provide BYOD mobile delivery, while the use of hyper-converged appliances with vSAN will enable us to scale in the future.
(Editor’s note: Dell EMC has since formalized this combination of solutions as Dell EMC VDI Complete Solutions, delivering this total package as a service for as low as $7 per user per month and a single point of contact for support.)
AM: Let’s talk about implementation. How did that process go?
UA: VDI implementation is complicated because it touches on all aspects of IT. Dell EMC, VMware and NVIDIA came on site with us to understand the challenges we needed to solve, the varying needs of our different departments, and how they could best help us. Our IT team is strong and deep, so we chose to do much of the work ourselves, but Dell EMC and the other vendors helped and supported us through the process through a single point of contact, which was incredibly helpful.
AM: What about results? Has the project delivered against your goals and expectations?
UA: Yes, it has. We now have the ability to rapidly deploy application pools, allowing us to quickly and efficiently deliver applications to students. That was one of the high-priority challenges we solved with this project.
We have an on-campus game development and visualization studio we call “Tesseract” that is now on the path to delivering learning environments in game format through VDI. And our College of Architecture and Design is now able to centralize its applications and computing power so students can work in their design software on any platform or device.
Our IT team members responsible for maintaining and supporting student lab endpoints are seeing a reduction in the resources required to support the labs now that the labs have VDI endpoints. Support staff are now free to work on higher-impact projects and services for the campus.
AM: From your perspective, why do you think this project won the Access Innovation Award?
UA: Deploying VDI on our campus meant pulling together a diverse range of components into a cohesive infrastructure that delivered a high-quality, PC-like experience for students. Dell EMC VDI allowed us to deliver the results we wanted for our students and faculty in a way that was cost effective and easy to manage. The VDI effort also brought together IT resources from across campus, working together in new ways, on a common cutting-edge technology platform – and that was no small feat.
Dell EMC congratulates the University of Arkansas for winning the Access Innovation Award. If you are interested in learning more about how VDI Complete is making high-quality, speedy VDI deployments possible for institutions and organizations across the country, visit https://experience-vmware.com/vdicomplete/.
Are you doing cool and great things with RSA Archer but too modest to tell anyone about it? No need to keep it a secret any longer: come to the RSA Archer Summit 2018, Aug. 15–17 in Nashville, and tell us all about it!
For the past fifteen years, the RSA Archer Summit has offered attendees a unique opportunity to learn from other RSA Archer customers and users about the new and innovative ways organizations are using RSA Archer to help achieve strategic business initiatives. The RSA Archer Summit is a great venue to share ideas, learn from peers, and develop friendships and business relationships that can help you and your organization excel.
This will be my fifth RSA Archer Summit, and in that time I have spoken to hundreds of customers, all doing amazing things with RSA Archer. Many of them would love the chance to talk about what they are doing, but are not quite sure how to pull it together into a presentation or which track to submit to.
If this is you, don’t worry: submitting an idea for a topic, session or roundtable is easy! And if you are not quite sure whether your idea is a good one, just ask us. We can help you craft it into a great presentation or roundtable conversation that fellow Archer users will find helpful and informative, and we will walk you through every step of the process. You just need to take the first step: send an email to RSAArcherSummit2018@rsa.com with your questions or ideas for a topic. We would be happy to work with you on developing them.
But don’t wait too long to send in your questions or session topics; submissions are due by March 30, 2018.
We have three main topic areas for this year’s RSA Archer Summit to make it easier for you to select the one that best fits your session. Below is a brief overview of each topic area. If you need more information, be sure to read Steve Schlarman’s blog “Call for Speakers.”
Business Risk Management
The RSA Archer Journey
RSA Archer Technical
If you are contemplating submitting a session, know that presenting at the RSA Archer Summit is not as hard as you might think, and it can be a very rewarding experience. And remember, we are here to help, so send your ideas and your session and roundtable questions to RSAArcherSummit2018@rsa.com if you need some guidance.
The Submission process is simple:
Selections will be communicated with speakers once the selection committee reviews all submissions.
In today’s world, the power of data analytics is everywhere. From agriculture to healthcare, from shopping to dating, from the vehicles we drive to the way we do business, our experiences are increasingly shaped by data analytics. This is true even when it comes to whiskey tasting, although in this case the analytics process is driven by our senses and our reasoning rather than sophisticated algorithms.
This is a topic that is close to my heart, given that I’m a director of data analytics who moonlights as a whiskey sommelier. I often have occasion to reflect on the amazing parallels between the principles of data analytics and the process of tasting whisky.
But before we count the ways in which data analytics and whiskey tasting share common ground, let’s take a step back and see how we got here.
Back in the 1980s, when data warehouse vendors like Teradata provided the ability to pool data, business owners began asking even more demanding questions. Then SAS and SPSS, whose origins lie in government and academic interests, developed tools that allowed for “what will happen” questions, not just “what happened.” Fast forward to now. Fueled by math smarts and entrepreneurial spirit, we expect Amazon to recommend books when we shop and Uber to send strangers with an empty seat to our address. Doubt what I’m talking about? Ask Siri or Alexa.
Back to whiskey. By 1954, the number of U.S. distilleries had collapsed into four companies. Courage and curiosity brought back independent distillation following the rise of microbreweries in the 1970s. Pioneer Tito Beveridge planted the craft distillery flag in Texas in 1997 after he observed the first seedlings in Kentucky and Tennessee. What followed was a wave of craft distilleries, with shoots emerging in California, Texas, New York, Colorado and Washington. It’s now a multi-billion-dollar business growing at 27.4 percent.
Amplifying this trend, cocktails popularized by shows like Mad Men or House of Cards put whiskey in our collective consciousness. We should no longer expect only wine to come in flights either. We can skip our way through whiskeys too. We arrived at the new normal: whiskey tours, bourbon runs and the rise of The Whiskey Sommelier. Whiskey tastings now pop up like daisies in a sun-drenched field. Whiskey tasting is an active sport involving all five senses and your brain.
Now let’s loop back to data analytics and look at the threads that tie these two worlds together. In particular, let’s look at 10 reasons why data analytics and whiskey tasting share common ground.
When online retailers look for patterns in clickstream data, they engage in a practice known as inductive reasoning. I see this and I see that; therefore this other thing is highly correlated. Ask people at Walmart why they stock strawberry Pop-Tarts at the front of the store before a hurricane. They will tell you it’s because they saw a pattern.
The same is true for experiencing whiskey. An active whiskey drinker analyzes what she experiences. Does she note a familiar herb like heather in all of the Speyside single malts? She is looking to establish a premise AFTER collecting data. She is not making a grand statement like “all Speyside single malts have heather” after tasting just one. Inferring generalities from specifics, then running experiments to prove the theory, is inductive reasoning.
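The pattern-spotting above amounts to measuring correlation. Here is a minimal sketch; the storm-severity and sales numbers are invented purely for illustration, and a Pearson coefficient near 1 is the kind of signal that sends Pop-Tarts to the front of the store:

```python
from statistics import mean
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up illustrative data: storm-warning severity vs. Pop-Tart sales.
storm_severity = [0, 1, 2, 3, 4, 5]
poptart_sales = [100, 140, 210, 260, 330, 390]

r = pearson(storm_severity, poptart_sales)
print(round(r, 3))  # strongly positive: a pattern worth investigating
```

Note that the coefficient only establishes the premise; as the whiskey taster's discipline suggests, you still run experiments before declaring a rule.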
Data analytics pursues feature detection to find what will predict an outcome. Features are like column headers in a spreadsheet. Lenders inspect aspects of home mortgage applications to see what attribute or combination of attributes will shine a light on those who are worthy. This process is not unlike whiskey tasting.
Consider tasting wheels. They are just circular spreadsheets. Each spirit can be scored for intensity against specific flavors. Those features originate from grain selection, fermentation, distillation, barrel type and aging. These are puzzle pieces that whiskey lovers adore assembling in their minds. They are getting to know the whiskey like characters in a novel.
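A tasting wheel really is a spreadsheet of features. A minimal sketch, with flavor names and intensity scores invented for illustration:

```python
# Each whiskey as a feature vector: flavor -> intensity (0-5).
# Names and scores below are invented for illustration only.
FLAVORS = ["smoke", "honey", "heather", "vanilla", "citrus"]

whiskeys = {
    "Speyside A": {"smoke": 1, "honey": 4, "heather": 4, "vanilla": 3, "citrus": 2},
    "Islay B": {"smoke": 5, "honey": 1, "heather": 1, "vanilla": 2, "citrus": 1},
}

def vector(profile):
    """Flatten a tasting-wheel profile into an ordered feature vector."""
    return [profile[f] for f in FLAVORS]

def dominant_feature(profile):
    """The single flavor that most distinguishes this spirit."""
    return max(profile, key=profile.get)

for name, profile in whiskeys.items():
    print(name, vector(profile), "dominant:", dominant_feature(profile))
```

In a lender's spreadsheet the "flavors" would be income, debt ratio and payment history, but the shape of the exercise is the same: turn each case into a comparable row of features.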
One of the shortcuts to dealing with large populations is to bucketize them into groups. The Boomer, Gen-Xer and Millennial labels are nothing more than a classification exercise based on birth years. We make generalizations about each group’s interests. Consider that Red Bull has the Millennials in its crosshairs while Metamucil aims at the gray hairs. This classification technique works well with whiskey comparisons too. Bourbon by law must be made from at least 51 percent corn. So in a blind test comparing spirits from different grains, *look* for the candy corn aroma. It’s a signature of this class of whiskey.
Propensity is a fancy term for what is likely to happen. It’s how data analytics deals with the unknown. We see a drop in the price of oil, and the propensity for Houstonians to leave the family cell phone plan rises. The same principle underpins whiskey food pairing. Chicken piccata, an Italian dish served with a lemon and caper sauce, is likely to go well with a rye. Why? Because the rye grain has lemon on the nose, so the propensity for the match is high.
Data analytics professionals, just like master distillers, like to experiment before they lock down a data model. They have ideas and tweak as they go. Internet properties like Facebook play with the shade of blue to see which gets the most clicks. So why not expect the same with whiskey? Whiskey blenders like Compass Box continue to push the envelope for their blended malts. They were famously cornered by the Scotch Whisky Association for a rather unconventional aging process in the original recipe of Spice Tree.
Establishing a Baseline (Supervised Learning)
We know what is normal for blood pressure because doctors have measured this vital sign for years. More than that, they have correlated both positive and negative outcomes to the data. They know a patient is high risk because of the histories of hundreds of thousands of patients. The role of data analytics is to determine what is “normal” based on a given data set. However, normal becomes useful when we know the outcome of a certain event too. In the land of data analytics, when we establish a baseline with known outcomes and ask algorithms to pick out things that predict the future, we are engaged in supervised learning.
The act of building a baseline for whiskey tasting comes from personal experience. Blind tasting after blind tasting helps the taster single out the single malts from the blends, a corn mashbill from rye or barley, secondary casking from single. The more a whiskey taster experiences, the bigger the sample set, the broader the foundation, the more the taster knows. This foundational knowledge helps whiskey tourists know when they have left the paved road and launched into an adventure.
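The baseline-with-known-outcomes idea can be sketched in a few lines. The readings and labels below are invented; the point is that the cutoff is *learned* from labeled history rather than decreed in advance:

```python
# A minimal supervised-learning sketch: learn a "normal vs. high-risk"
# cutoff from labeled historical readings. Values and labels are invented.
readings = [(110, 0), (118, 0), (122, 0), (128, 0),
            (142, 1), (150, 1), (158, 1), (165, 1)]  # (systolic, high_risk?)

def learn_cutoff(data):
    """Pick the threshold that misclassifies the fewest labeled examples."""
    candidates = sorted(v for v, _ in data)
    return min(candidates,
               key=lambda t: sum((v >= t) != bool(label) for v, label in data))

cutoff = learn_cutoff(readings)
print("learned cutoff:", cutoff)
print("new reading 147 is high risk:", 147 >= cutoff)
```

Real supervised learning fits far richer models than a single threshold, but the workflow is the same: known outcomes in, a predictor of future outcomes out.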
Anomalies get a bad rap. That is, until you understand that all parents want their kids to be normal, but never average. Being above average earns gold medals on the downhill and early acceptance to that hard-to-get-into college. This is not normal. Seeking anomalies is the job of talent scouts. It is also core to data analytics because it’s something from which we learn. It might be the use of a product the designer never expected. Ask Pfizer about its original intent for Viagra.
The world of whiskey tasting presents a similar opportunity. Greenspot Irish Whiskey has green apple all over it. Westland Single Malt tastes like chocolate. And Hudson Four Grain has a barnyard quality to it. You can almost hear the sheep. When you hang notes in the air like Pavarotti, you get noticed. Anomaly detection is a different kind of appreciation. Whiskey aficionados aim for this.
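Anomaly detection in its simplest form is just distance from the baseline. A sketch with invented "chocolate intensity" scores, where one Westland-style outlier stands well apart from the cluster:

```python
from statistics import mean, stdev

def anomalies(values, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Invented scores: most single malts cluster low, one outlier stands out.
scores = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.1, 4.8]
print(anomalies(scores))  # the Pavarotti of the flight
```

Production systems use subtler statistics, but the instinct is the same one the aficionado has: know the cluster well enough that the note hanging in the air gets noticed.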
Eighty percent of a data scientist’s time is spent wrangling data: filling in the missing elements in a table so the columns and rows are ready to be analyzed. It is hard to draw conclusions when artifacts are missing. And this same rigor is pursued by the whiskey trade. Evaluating whiskeys side by side is valid only if the spirits are served in the same way and at the same time.
We know that wine oxidizes in the glass. An angry glass of cabernet becomes approachable after an hour once it breathes. Time also plays into whiskey, except oxygen is not the factor. When alcohol evaporates from the glass, precious olfactory volatiles escape with it. A whiskey freshly poured might be feisty at five minutes and friendly at 50. So it’s important that we treat data and whiskey with the same level of consistency: same glass shape, same pour size, same time out of the bottle.
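The wrangling step described above, filling gaps so every row can be compared under identical conditions, can be sketched minimally. The table and its values are invented for illustration:

```python
from statistics import mean

# Invented table with gaps (None): fill each missing cell with the
# column mean so every row is ready for analysis.
rows = [
    {"age_years": 12, "abv": 43.0},
    {"age_years": None, "abv": 46.0},
    {"age_years": 18, "abv": None},
]

def fill_missing(table):
    """Replace None cells with the mean of the observed values in that column."""
    columns = table[0].keys()
    filled = [dict(row) for row in table]  # copy so the input stays untouched
    for col in columns:
        observed = [row[col] for row in table if row[col] is not None]
        default = mean(observed)
        for row in filled:
            if row[col] is None:
                row[col] = default
    return filled

clean = fill_missing(rows)
print(clean)
```

Mean imputation is only one of many strategies, and often not the best one, but it shows why the chore eats so much of a data scientist's day: the analysis cannot start until the table is whole.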
Business data only gets better when we add diverse data types like geo-spatial, event-related or weather data to it. When a Texan shops online at happy hour on Cinco de Mayo and abandons the cart, there should be no surprise. Oh, it was sunny that day. This context-driven awareness adds a deeper understanding. Modern analytics is all about enriching structured data with unstructured data to gain a fuller picture.
Likewise, distillers are aging whiskey in second or third casks. They take a completed product and finish it in wine, sherry, port, rum or Sauternes casks. Or it might take the form of the *blenders’ art*, like Johnnie Walker, Dimple or Compass Box. These distillers ladder up the experience by marrying whiskeys from different places. Hudson Four Grain ages the same spirit in three different sizes of barrels, each with a different char, to get that exact expression.
Shaped by Taxation
Fans of history have no trouble remembering when Alexander Hamilton rode to Western Pennsylvania in 1794 to help his boss put down the Whiskey Rebellion. This was before Mr. Hamilton was a Broadway sensation. He was our first secretary of the treasury and was hungry to pay down the national debt with a whiskey tax <gasp>. Our friends in Scotland struggled with the same issue in the mid-18th century, when distillers were taxed based on still size rather than production. In 1787 taxation was the tipping point in the split between the Lowlands and the Highlands. Religion, language and affiliation with England may keep proper historians talking, but a true Gaelic Highlander knew the real argument was over whether blended whiskey was a *real* whiskey or swill to appease the English.
Likewise, analytics in the English-speaking world found its voice because of taxation. The census first appeared in Britain in Roman times and became a consistent effort in 1801, its mission being to allocate precious resources. And since counting people one by one takes longer than a single decade, statistics found its place in economics.
So the next time you raise a Glencairn glass of the *water of life*, just remember that as you ponder notes of heather, sea air and the smell of warm biscuits, you might actually be thinking like a data scientist.
Introducing a Single Product to Bundle Data Protection with VMware Workloads on AWS!
As more organizations move applications and data to the cloud, data protection is more important than ever. Solid, reliable data protection workflows guarantee data is always available and ready for recovery when needed, ensuring little to no downtime for our customers’ business operations. For customers virtualizing with VMware in the cloud, Dell EMC now makes it easy to protect VMware workloads on Amazon Web Services (AWS) with an all-in-one bundle that is cost effective and simple to purchase and manage.
Top Customer Advantages of the Dell EMC Data Protection for VMware Cloud on AWS Bundle:
Top Partner Advantages of the Dell EMC Data Protection for VMware Cloud on AWS Bundle:
Top Solution Benefits of Running Dell EMC Data Protection for VMware Cloud on AWS:
Ways to Learn More:
We are excited to go big and win big with you this year! Good selling!
The recent news of side-channel analysis vulnerabilities affecting many modern microprocessors has, as you can imagine, generated more than a few inquiries from our customers about updating their PowerEdge servers. If you’re in the same boat, asking yourself “What comes next? How do I apply these BIOS updates?”, then this post should help.
First things first, applying a BIOS update to a PowerEdge server is easy. Dell supplies different tools so you can choose the method best suited to your particular IT environment and needs.
Updating One or Two Servers?
If you’re just updating one or two servers in a small shop, you can obtain a BIOS update package manually from support.dell.com by keying in your server’s Service Tag and then looking for a BIOS update such as the one shown in figure 1.
NOTE: Dell EMC downloads and driver updates are free. That’s always been the case and there are no plans to change that.
Downloading this file and then applying it manually to a local server is straightforward, but if you have hundreds or more servers in a remote data center you’ll want to keep reading because we have better options for you.
Updating Lots of Servers, Even Automatically
Intelligent Automation is a Dell EMC hallmark, and Dell EMC offers a range of OpenManage solutions that can simplify mass server updates. With Dell EMC Repository Manager, new updates from Dell EMC online catalogs can be automatically downloaded, as shown in figure 2.
You can tell Repository Manager when to download updates, which servers you own, and what kind of updates you want. You can also command Repository Manager to download different sets of updates for different logical or physical groups of servers, and then to separate them into repositories in different locations. This gives you the flexibility to support different deployment methods.
So now you have a BIOS update. You’ve tested it and you want to deploy it to the production servers in your datacenter. Now what? Dell EMC recommends one of the following approaches to automate updates:
As an example, OpenManage Enterprise, the next-generation Dell EMC management console, provides a simple click-and-go process to schedule and perform BIOS updates for thousands of servers (see figure 3).
Those systems will process the update as scheduled and with no further intervention. If you’re new to managing PowerEdge servers, this is an easy way to efficiently update thousands of servers without a lot of effort.
If you already manage your IT environment with an existing management platform such as System Center or vSphere, our integrations and connections make short work of incorporating PowerEdge servers.
And if you use scripts to perform IT operations, we offer resources on Dell TechCenter as well as open source PowerShell and Python scripting repositories at http://github.com/dell. These assets provide a good starting point for automation and can be adapted to the specifics of your IT environment.
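To give a feel for what such a script might look like, here is a minimal Python sketch that builds the request body for the standard Redfish UpdateService.SimpleUpdate action, which iDRAC exposes on recent PowerEdge servers. The iDRAC address, repository host, and package name are hypothetical placeholders, not values from the Dell repositories mentioned above.

```python
# Minimal sketch of scripting a BIOS update over the Redfish REST API.
# The address and image URI below are hypothetical placeholders.
import json

IDRAC_HOST = "https://192.0.2.10"  # placeholder iDRAC address
SIMPLE_UPDATE = "/redfish/v1/UpdateService/Actions/UpdateService.SimpleUpdate"

def build_simple_update_payload(image_uri, protocol="HTTP"):
    """Build the JSON body for a Redfish SimpleUpdate request.

    image_uri points at the BIOS update package on a reachable share;
    protocol must match the URI scheme (HTTP, NFS, CIFS, ...).
    """
    return {"ImageURI": image_uri, "TransferProtocol": protocol}

payload = build_simple_update_payload("http://repo.example.local/BIOS_1.2.3.EXE")
print(json.dumps(payload))
# An HTTP client would POST this body to IDRAC_HOST + SIMPLE_UPDATE with
# iDRAC credentials; the controller stages the update and returns a job ID
# that can be polled until the BIOS flash completes on the next reboot.
```

In practice, the same payload builder can be looped over an inventory of iDRAC addresses, which is the essence of a one-to-many scripted update workflow.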
Dell EMC Advantage: Dell EMC provides the tools to deploy updates in a manner that best suits your needs. We realize that one method does not fit all situations.
Dell EMC makes a variety of tools so you can perform server updates quickly and securely, particularly as part of an automated one-to-many update workflow. And because Dell EMC provides easy-to-use tools that integrate well with each other and with third-party tools, they are readily adapted to a variety of IT environments.
If you want to download a slightly longer version of this post, you can find it online at http://dell.to/2CpiSEg. For detailed, technical information on performing updates on Dell EMC PowerEdge servers, please visit this Dell TechCenter archive: http://dell.to/2o04cSn or for more on OpenManage systems management tools and technologies, please reference the Dell TechCenter wiki at http://dell.to/2w4myYE.
The current draft of the coalition agreement for the new German government does not include keywords such as the Internet of Things or IoT; however, it does contain a number of references to Industry 4.0 that could be pointing to IoT. The passage naming “the creation of open and interoperable standards” among the central goals garnered much attention. It succinctly captures the sorry state of IoT today: every service provider follows its own design when developing software for IoT solutions, machines, and robots. These systems can hardly be considered compatible with one another, which catapults us back to the fragmented IT landscape of the 1980s. Haven’t we learned anything? Proprietary systems are costly to implement and inefficient to operate, and they let hackers flourish, because it’s difficult to protect so many different devices and pieces of equipment.
However, the various manufacturers are not to blame. The problem is the lack of standardization, and the question, of course, is how to develop standards for a market we are still trying to create. Many players (manufacturers, for the most part) are mixing up their own batch of standards in the meantime, yet none has pushed to the forefront. Individual countries like France are even trying to develop their own standards, and the next German government has finally announced that it will tackle the issue (see above). It doesn’t really matter whether these individual efforts deliver concrete results, because at best they will end in a patchwork of standards, which defeats the purpose of standardization in IT. Cooperation with global standardization bodies and other countries will play a major role in the process. The current coalition agreement rightly states that “the development of common global standards and norms needs to be pushed forward.” But it’s one thing for politicians to state their intent; it can take a while before anything is implemented.
Nevertheless, the delayed development hasn’t stopped market researchers from churning out superlative-laden forecasts. BI Intelligence, for example, projects a five-year market volume of $6 trillion, and other analysts foresee very similar numbers.
It’s no wonder that we are seeing rapid growth. IoT offers many advantages: streamlined processes, more rapid response times on the market, predictive maintenance, improved capacities for machines/equipment, traceability of products, new and innovative markets, more satisfied customers, and lower costs overall for IT, product R&D, and companies.
However, these advantages exact their price: gigantic quantities of data. New analysis procedures based on machine learning and artificial intelligence are necessary to manage this amount of data and to extract high-quality (business) insights from it. Both technologies are being implemented more and more in IoT, and I am willing to bet that AI and IoT will soon be inseparable. I will even double down and predict that, in a few years, we will look back on this as the time when IoT and AI changed the very essence of how we live and how our economy works.
Automated driving provides a wonderful example. It will not only replace conventional vehicles and demonstrate innovation in its purest form, but also change the lives of millions of businesspeople, who will no longer consider the time spent behind the wheel of their personal vehicles wasted; they will finally be able to combine their commute with productive work. At the same time, millions of people with an aversion to driving, or those unable to drive, will regain unlimited mobility. Automated driving will shake the very foundations of our lifestyles.
Dell Technologies is also in the vanguard of IoT. Michael Dell confirmed this fact at a presentation of new IoT strategies last fall: “IoT is fundamentally changing how we live, how organizations operate and how the world works.” And these words have been heeded: A new business division for IoT has been created, and we will invest $1 billion in the Internet of Things over the next three years.