Federal CIO Talks Up Government In The Cloud


WASHINGTON — Though he knows it’s a slow ship to turn, Vivek Kundra is adamant that the federal government will shift its IT infrastructure to the cloud-based model that has been transforming the private-sector enterprise for much of the past decade.

Addressing the subject Wednesday morning here at the Brookings Institution, the country’s first federal CIO voiced a mixture of bewilderment at the government’s failure to keep up with the private sector in cloud computing and resolve to close the gap.

“What I would submit to you is that part of the reason is because we’re focused on building datacenter after datacenter, procuring server after server, and we need to fundamentally shift our strategy on how we focus on technology across the federal government,” Kundra said.

Kundra’s speech came on the same day that the administration hit a milestone in its open government initiative. The Office of Management and Budget had set today as the deadline for all the federal agencies to publish their plans for making data sets publicly available on the Web in a machine-readable format and offering greater transparency into their operations.

But Kundra’s talk today was focused on the substantial cost savings that can be gained by eliminating the staggering inefficiencies in the federal computing model, where he said server utilization rates are as low as 7 percent, a situation he called “unacceptable.”

“We need to find a fundamentally different strategy as we think about bending this curve as far as datacenter utilization is concerned,” he said.
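To put the utilization problem in perspective, here is a quick back-of-the-envelope sketch. Only the 7 percent figure comes from the article; the fleet size and the 60 percent target are assumptions for illustration:

```python
# Back-of-the-envelope consolidation math from the utilization figure above.
# Only the 7% utilization comes from the article; the fleet size and the
# 60% target are illustrative assumptions.
current_util, target_util = 0.07, 0.60
servers_now = 10_000                        # hypothetical fleet size
busy_capacity = servers_now * current_util  # capacity doing useful work today
servers_needed = busy_capacity / target_util
print(f"{servers_now} servers could shrink to about {servers_needed:.0f} "
      f"at {target_util:.0%} utilization")
```

At these assumed numbers, close to nine out of ten servers would be surplus, which is the curve Kundra wants to bend.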

Estimates of the cost savings to be had by migrating to a cloud model vary widely, particularly in the federal government, where much of the material locked in datacenters relates to national security or contains personal information about citizens that wouldn’t work in a Salesforce-style cloud environment.

Nevertheless, Kundra was emphatic that the fiscal benefits of cloud computing for non-sensitive data are substantial. The Brookings Institution today released a study estimating that agencies stand to save between 25 and 50 percent of their IT budgets by phasing out private, in-house file servers and moving to the cloud.
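For a rough sense of what that range means in practice, a toy calculation follows. The budget figure is a placeholder, not a number from the study:

```python
# Toy application of the Brookings 25-50% savings estimate to a hypothetical
# agency IT budget. The $500M budget is a placeholder, not from the study.
budget = 500_000_000          # hypothetical annual agency IT budget, in USD
low, high = 0.25, 0.50        # Brookings-estimated savings range
print(f"Estimated savings: ${budget * low:,.0f} to ${budget * high:,.0f} per year")
```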

As it begins its move toward the cloud, the government is actively courting the private sector to participate in the standards-setting process that will establish certification requirements for security, data portability and interoperability.

“Security is clearly the biggest barrier,” Kundra said. “Data portability is another barrier because we don’t want to lock the federal government into one vendor.”

On May 20, the National Institute of Standards and Technology is planning a cloud summit, offering vendors in the private sector a seat at the table as it begins work on setting cloud standards, which Kundra said is a critical step toward achieving the structural efficiencies he envisions, rather than simply “Webifying our current infrastructure.”

“What this moves us away from is every vendor having to go out there and certify from agency to agency, bureau by bureau, which is going to drive up the costs, and frankly doesn’t necessarily move us to a posture that creates better security,” he said.

Kundra spoke with a certain urgency about the need to move to the cloud, in part due to the rapid proliferation of data that federal agencies are creating and storing. Over the past decade, the number of federal datacenters has increased from 493 to 1,200, and hardware, software and file servers account for more than a quarter of the federal IT budget, according to the Brookings report.

At the same time he is a realist, acknowledging that the hulking federal IT apparatus is not the sort of thing that can be immediately reformed courtesy of an executive order or congressional mandate.

“This shift to cloud computing is not going to happen overnight,” Kundra said. “This is a decade-long journey.”

Kenneth Corbin is an associate editor at InternetNews.com, the news service of Internet.com, the network for technology professionals.


Hybrid Cloud Adoption Strategy For CIOs

The hybrid cloud market is expected to grow from US$44.6 billion in 2018 to US$97.6 billion by 2023.

Overview of Hybrid Cloud

Today, business is changing continuously, and technology is racing to keep up with that agility and meet time-to-market demands. Organizations are expanding their use of cloud to maintain their competitive edge, accelerate innovation, and transform interactions with customers, employees, and partners. The pandemic has further increased the demand for speed of delivery and scale of cloud adoption.

According to International Data Corporation (IDC), “By 2022, over 90% of enterprises worldwide will be relying on a mix of on-premises/dedicated private clouds, multiple public clouds, and legacy platforms to meet their infrastructure needs.”

MarketsandMarkets research estimates that “the hybrid cloud market is expected to grow from US$44.6 billion in 2018 to US$97.6 billion by 2023, at a Compound Annual Growth Rate (CAGR) of 17.0% during the forecast period.”

CIOs across industries are busy working with multiple cloud providers, deciding what to retain and what to improve across the organization’s cloud estate. One of the fundamental decisions they need to make is how to balance the on-site, remote, public, and private elements of that combination. CIOs need a strategy for adopting emerging technologies that delivers more business value than before.

In addition, CIOs’ security concerns about public cloud providers are going down, while concerns about vendor lock-in are trending up. Organizations do not want to put all their eggs in one cloud provider’s basket and take on that dependency.

The major factors that CIOs need to consider before promoting a hybrid cloud model across the organization are:

Does the organization have the right business culture to embrace rapid change?

What value does hybrid cloud bring to the organization’s business?

Will it help the organization lower costs, improve processes, and better manage security?

Is the organization ready for a new service model?

Can the organization manage the change involved in moving to hybrid cloud while maintaining business continuity?

Does the organization have suitable use cases for hybrid cloud adoption (for example, packaged applications, or disaster recovery setups across multiple geographies)?

What is the volume of demand for new application development on existing architectures, and for development of next-generation applications?

Hybrid cloud addresses all of these concerns. It is one of the quickest ways to gain agility and speed by matching the right workload to the right environment.

Hybrid cloud is not a product or software service. It is an approach to cloud computing that combines private cloud, public cloud, and/or on-premises environments. The combination may include infrastructure, virtualization, bare-metal servers, and/or containers.

Key Considerations for Adoption of Hybrid Cloud 

Hybrid cloud plays a key role in speeding the delivery of IT resources to end users, improving disaster recovery capabilities, and making better use of resources. Some key considerations for hybrid cloud adoption are listed below; a short sketch after the list shows how they might combine into a placement decision.

Visibility of Current State: Assess the current application landscape and infrastructure across the organization. A complete application portfolio analysis should be performed before deciding on cloud adoption.

Rate of Cloud Adoption: A big-bang approach is not going to fly. Which applications move to the cloud, and on what timeline, should be decided based on their criticality. The factors that determine the rate of adoption include application complexity, data requirements, regulatory and compliance needs, modernization prerequisites, cost implications, and real-time requirements.

Portfolio Rationalization: Identify the business functions and the applications that deliver those business capabilities. Re-engineer these business functions based on industry trends and mergers/acquisitions. Identify redundant functionality across applications and rationalize it before moving to the hybrid cloud.

Application Migration: Applications can be migrated to and from the data center and the cloud under a hybrid cloud model. Identify the applications that need to remain on premises and those that should move to private or public cloud. The migration can be temporary or permanent, depending on the enterprise migration strategy.

Nature of Applications: Applications that change frequently should move to the cloud to leverage automated deployment through DevOps. Applications that handle sensitive data are best retained on premises. Applications with very high scalability requirements, whether from seasonal or time-of-day swings in user load, are ideal candidates for public/private cloud hosting.

Selection of Environments: Public cloud providers may not offer the specialized hardware or application components some workloads require, making public cloud a non-viable option for those applications. The core idea is to choose the environment in which the application can deliver its functionality at the most optimal cost and effort.

Integration Strategy: Even after migration to the cloud, applications still need to connect back to historical data residing on on-premises servers. Diverse integration patterns emerge as applications are modernized, handling a multitude of structured and unstructured data sources. All of this demands a refined, organization-wide integration strategy for the hybrid cloud.

Regulatory Requirements: Regulatory and compliance obligations demand that some applications and data reside on premises. This requires deeper due diligence to select the right candidates for hybrid cloud and the right hosting option for each application.

Containerization: Containerizing applications helps make them cloud agnostic, able to move across public, private, and on-premises environments. Containers and microservices go together like idly and sambar, if prepared wisely: microservices, once containerized, deploy naturally into a hybrid cloud environment.

Cloud Interoperability: Integration between cloud offerings across multiple providers and cloud types is a key consideration for the success of hybrid multi-cloud adoption.
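To make the placement question concrete, the following is a minimal sketch of how considerations like data sensitivity, hardware needs, and scalability might combine into a workload-placement heuristic, as mentioned above. Every field name, rule, and example workload is an illustrative assumption, not a prescribed methodology:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    handles_sensitive_data: bool   # regulatory/compliance consideration
    change_frequency: str          # "high", "medium", or "low"
    needs_special_hardware: bool   # selection-of-environments consideration
    scalability_need: str          # "high", "medium", or "low"

def suggest_placement(w: Workload) -> str:
    """Illustrative heuristic mapping the considerations above to an environment."""
    if w.handles_sensitive_data:
        return "on-premises"       # compliance keeps data in-house
    if w.needs_special_hardware:
        return "private cloud"     # public providers may lack specialized hardware
    if w.scalability_need == "high" or w.change_frequency == "high":
        return "public cloud"      # elastic capacity, DevOps-style deployment
    return "private cloud"         # default for steady, non-sensitive workloads

portfolio = [
    Workload("payroll", True, "low", False, "low"),
    Workload("web-storefront", False, "high", False, "high"),
]
for w in portfolio:
    print(f"{w.name}: {suggest_placement(w)}")
```

A real assessment would, of course, also weigh the integration, regulatory, and cost dimensions discussed above.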

Benefits of Hybrid Cloud Adoption

Hybrid cloud gives the organization increased flexibility in delivering IT resources, improved disaster recovery capabilities, and lower IT capital expenses. Other benefits to the organization include:

Business Acceleration: speeds up business processes, supports collaboration, and provides cost-effective solutions that free up IT budget for innovative, revenue-generating projects.

Cost Reduction: reduces operating and capital costs while improving performance, productivity, and business agility via a flexible, scalable solution. Organizations can choose which applications to move between clouds and on-premises environments based on their requirements.

Reliability: if one cloud goes down, some functionality remains available to users from the other deployed clouds. Generally, one public cloud can serve as a backup to another (a minimal client-side failover sketch follows this list).

Risk Management: mitigates risk with a single, unified cybersecurity solution.

Manage Legacy Systems: rather than replacing legacy systems, hybrid cloud can bridge the gap between legacy and new, providing major cost savings.

Scalability: applications can scale almost without limit under a hybrid multi-cloud strategy, while core business data stays secure through on-premises hosting.
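As referenced under Reliability above, here is a minimal client-side failover sketch. The endpoint URLs and fleet layout are hypothetical, and a production system would add retries, health checks, and data-consistency handling:

```python
import urllib.error
import urllib.request

# Hypothetical endpoints for the same service deployed to two clouds.
ENDPOINTS = [
    "https://service.cloud-a.example.com/health",
    "https://service.cloud-b.example.com/health",
]

def fetch_with_failover(urls: list[str]) -> bytes:
    """Try each deployment in order; return the first successful response."""
    last_error = None
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()
        except (urllib.error.URLError, TimeoutError) as err:
            last_error = err           # this cloud is unreachable; try the next
    raise RuntimeError(f"all deployments unreachable: {last_error}")

# fetch_with_failover(ENDPOINTS) keeps working if cloud A alone goes down.
```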

Conclusion

Hybrid cloud adoption helps the organization lower its infrastructure footprint, improve security, increase resilience, reduce downtime, and accelerate time to market.

Hybrid cloud, then, is not just about delivering cost savings. It is about the organization becoming more agile, efficient, and productive. Organizations of any size can adopt hybrid cloud to run the business more cost-efficiently.

Acknowledgements

The authors would like to thank Vijayasimha A & Raju Alluri of the Wipro Digital Architecture Practice of Wipro Ltd for giving the required time and support in many ways in bringing up this article.

Author

Dr. Gopala Krishna Behara is a Distinguished Member and Lead Enterprise Architect at Wipro Technologies with 25+ years of extensive experience in the ICT industry. He serves as an advisory architect and mentor on enterprise architecture and application modernization, and continues to work as a subject matter expert and author. He is certified in Open Group TOGAF, AWS Solution Architect (Associate), IBM Cloud Solutions, and UNPAN. He has published a number of research papers and books in the IT industry. He has been a speaker at national and international forums and bodies such as The Open Group and the National e-Governance Forum, and a moderator and panel member for multiple technical/business forums including IGI Global, AEA, Open Group, and premium college meets. He is a recipient of the EA Hall of Fame International Award for Individual Leadership in EA Practice, Promotion and Professionalization.


Backing Up Your Business: How To Successfully Tier Data In The Cloud

by Devon Cutchins

There is no one-size-fits-all solution when it comes to data storage. Most of the companies I talk to have accepted that utilizing cloud-based storage should be a part of any well-crafted data management plan.  While many companies dive into the cloud by choosing one solution that meets the minimum requirements for all of its data, I would argue that a more prudent approach is to segregate that data based on the requirements for use of the data.  In doing so, you can leverage multiple cloud solutions to provide the best performance and value for your company.


Understanding the difference between hot and cold data

As a colleague of mine is fond of saying, “Fast, cheap or good.  Choose two.  You can’t have all three.” Data solutions naturally fall along a performance and cost continuum. Solutions on the “hot” end of the spectrum feature high performance and high availability, which are ideal for mission-critical data that needs to be accessed quickly and frequently with zero downtime. Solutions on the “cold” end of the spectrum are more cost-effective per GB and are ideal for data that is rarely accessed, or that can afford to be accessed with higher latency.  

Why even keep cold data if you don’t access it, you ask? Often, cold data is kept to meet specific and important industry regulations and compliance requirements, such as HIPAA. Organizations that choose not to keep their cold data often open themselves up to regulatory infractions and possible fines.

The first step in choosing your cloud storage solution should be to determine where on the spectrum your data lies, i.e. how to “tier” your data. To do this, it’s important to understand the different solution tiers and how they differ in performance; a short illustrative sketch follows the tier descriptions below. Considering which storage tiers best meet your needs is the best approach to achieving a successful, ongoing storage strategy.

Tiering your hot, warm and cold data in the cloud

Tier 0 – Tier 0 is relatively new, since the cost and reliability of solid state drives (SSDs) have only recently made them practical in the mainstream market. In terms of hot and cold, this storage option is smokin’ hot. Solid state drives can deliver up to 30 times the performance of the fastest spinning hard drive, while using a fraction of the space and power. In Tier 0, you’ll experience extremely high input/output operations per second (IOPS) and extremely low latency, which results in accelerated online transaction processing and shortened batch processing, for example. Unfortunately, SSDs still come at a very high cost. Tier 0 is the fastest and the best, which means it is also the most expensive per GB.

Tier 1 – Tier 1 is usually composed of redundant, high-performance drives, offering quick response times and fast data transfer rates. These solutions offer very high IOPS and very low latency, but at high cost. Tier 1 solutions should be used sparingly, but they are typically an essential element of any enterprise-class company’s portfolio.

Tier 2 – Tier 2 is your primary tier for active data, and is probably where most of your data should live. It typically features enterprise SATA drives that favor capacity over performance, with medium IOPS, medium latency, and medium cost.

Tier 3 – Tier 3 storage should be used for data that is rarely accessed but still needed. In other words, Tier 3 is your first tier for archiving and can be classified as warm storage on the spectrum. These solutions feature low IOPS and relatively high latency, but, ideally, low costs. The performance of Tier 3 solutions still crushes that of Tier 4. This is perhaps the most underserved tier in the marketplace, with vast price ranges. There are, however, several affordable Tier 3 solutions available with excellent archiving performance.

Tier 4 – Tier 4 is your final line of cold data tiering, and should be reserved for your “offline” data, since it’s almost always stored on tape or offline drives. Expect very low to no IOPS, extremely high latency (retrievals can take days), and extremely low cost. This tier should be reserved for data that is being kept for backup or compliance purposes only. Watch out for recovery costs if you plan to use a Tier 4 solution: although this can be the cheapest storage on the market, many Tier 4 solutions have a high cost associated with recovering your information.
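To tie the tiers together, here is a small illustrative sketch that maps a dataset’s access profile to one of the tiers above. The thresholds and parameter names are invented for illustration; real tiering decisions would also weigh provider pricing, compliance obligations, and recovery-cost terms:

```python
def suggest_tier(accesses_per_day: float, max_latency_seconds: float,
                 archive_only: bool) -> int:
    """Map an access profile to a storage tier (0 = hottest, 4 = coldest).

    All thresholds are illustrative assumptions, not industry standards.
    """
    if archive_only:
        return 4                  # offline/tape: backup and compliance only
    if accesses_per_day > 1000 and max_latency_seconds < 0.01:
        return 0                  # SSD-class: extreme IOPS, lowest latency
    if accesses_per_day > 100:
        return 1                  # redundant high-performance drives
    if accesses_per_day > 1:
        return 2                  # capacity-oriented drives: primary active data
    return 3                      # warm archive: rarely accessed but still online

print(suggest_tier(5000, 0.005, False))  # hot transactional data -> 0
print(suggest_tier(0.1, 3600, False))    # rarely touched archive -> 3
```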

Storing data in its optimal tier

To best optimize your data, your organization must agree on storage requirements. This is the final frontier in achieving and implementing a cloud storage strategy that best fits your company’s needs. You must also consider the importance of disaster recovery, energy consumption, and IT manpower, and the related cost implications. For example, not every file needs to be encrypted or de-duplicated, and implementing those features on every file will only raise costs and lower performance.
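As a sketch of that idea, a per-class storage policy might enable the costly features only where they are needed. The class names and flags below are hypothetical examples, not settings from any particular product:

```python
# Illustrative per-class storage policy: enable expensive features selectively.
# Class names, flags, and tier assignments are hypothetical examples.
STORAGE_POLICY = {
    "customer-records": {"encrypt": True,  "deduplicate": False, "tier": 1},
    "build-artifacts":  {"encrypt": False, "deduplicate": True,  "tier": 2},
    "compliance-logs":  {"encrypt": True,  "deduplicate": True,  "tier": 4},
    "marketing-assets": {"encrypt": False, "deduplicate": False, "tier": 3},
}

def policy_for(data_class: str) -> dict:
    # Unclassified data falls back to a conservative default.
    return STORAGE_POLICY.get(
        data_class, {"encrypt": True, "deduplicate": False, "tier": 2}
    )

print(policy_for("build-artifacts"))
```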

At the end of the day, by taking the time to understand the data you want to store in the cloud, which tier it best fits in, how often you’ll need to access it, its place among your business activities, and the features you require to store it, you’ll be on the path to successfully backing up your business assets and utilizing the cloud to its fullest potential.

Devon Cutchins is the SVP and CLO of Markley Group.


Microsoft Sees Opportunity In Shrinking Federal Budgets

WASHINGTON — As federal agencies put their budgets under the microscope looking for items to trim, under-producing IT projects could land on the chopping block.

Federal CIOs are under pressure from the White House tech team to eliminate inefficient tech deployments, and either overhaul or abandon projects that are running over budget or behind schedule.

That means that IT firms looking to do business with the federal government are going to have to prove their case, according to Teresa Carlson, vice president of Microsoft’s (NASDAQ: MSFT) federal government division.

“I think in the federal [market] we have to prove our worth now,” Carlson said in a presentation here at Microsoft’s annual Worldwide Partner Conference. “Business cases are going to become extremely important.”

Greg Myers, the general manager of Microsoft’s federal civilian business, said that the company is pitching federal IT managers on cloud-based technologies that it argues are more agile and entail more flexible operating agreements and lower cost than enterprise infrastructure solutions from companies like Oracle (NASDAQ: ORCL) and VMware (NYSE: VMW).

Myers also described a growing tension in the government IT sector in response to the Obama administration’s emphasis on bringing more data and services online while still maintaining and strengthening the security of sensitive data. That friction is particularly acute in areas such as health care, where Microsoft and scores of other firms are vying for government contracts in areas such as electronic health records (EHR).

“You’re seeing this violent collision of being as open as they possibly can on the civilian side, especially in healthcare, and you’ve got, obviously, security [which] is paramount,” he said.

“The CIOs…are really struggling with how to serve both masters. It’s one thing if someone hacks a Google mail account or a Facebook account. It’s quite another if someone gets into your EHR,” he added.

By convening its partner conference in the nation’s capital, Microsoft executives have had a chance to hear about the government’s IT priorities directly from the officials making the decisions.

CEO Steve Ballmer has been making the rounds in Washington this week, meeting with the deputy secretaries at the president’s management council and sitting in on a CIO roundtable. Ballmer also met with senior officials at the Department of Veterans Affairs, and stopped in at Walter Reed Hospital to hand out Xboxes and Zunes to soldiers wounded in combat.

One of Ballmer’s messages to the CIOs was to shorten the development and procurement cycle, Carlson said. Within Microsoft, projects that drag on for more than a year are closely scrutinized, while none is generally allowed to languish for more than two and a half years. But in government IT, historically, it has not been uncommon for projects to crawl along for several years.

“They were blown away by this,” Carlson said. “One of the things the CIOs brought up that they’re struggling with is the culture shift.”

Federal CIO Vivek Kundra has been leading an effort to evaluate under-performing or over-budget IT projects across the agencies, while also trying to bring more transparency into how much money the government is spending on various projects. Just yesterday, Kundra debuted the revamped IT dashboard, an online tool that tracks federal technology projects throughout the agencies and departments.

“They really are paying close attention to these large projects that aren’t working,” Carlson said. “They’ve already canceled one with the VA. There’s going to be more of these.”

“Cost really is king,” said Kris Teutsch, head of Microsoft’s national security group. “That’s driving the behavior of procurement.”

Kenneth Corbin is an associate editor at InternetNews.com, the news service of Internet.com, the network for technology professionals.

Office Productivity In The Cloud With The iPad

I pre-ordered the 32GB Wi-Fi model of the Apple iPad, and I have been immersing myself in what it can and can’t do for the past couple of days since it arrived. I am still trying to test the boundaries of what the iPad can do as a mobile business tool and determine its limitations as a notebook replacement.

But the question before me is whether the iPad has what it takes for me to leave the notebook at home and rely on the tablet for business productivity as well. One of the weaknesses of the iPad is its limited storage capacity. It’s not expandable, so whatever capacity you bought is what you’re stuck with, on the device itself at least.

The iPad does have access to the cloud, though. And the cloud, unlike the iPad, has virtually limitless storage capacity. Many small and medium businesses are already leveraging cloud-based apps for office productivity with Google Docs, a natural fit for the iPad.

With my Wi-Fi-only iPad, though, I can’t count on always being connected to the cloud. Thankfully, Memeo has an app that solves that problem, sort of. Memeo Connect Reader syncs your documents from Google Docs so they are available on the iPad even when it’s offline. You can view the docs no matter where you are, connected or not, in native formats for Microsoft Office, Apple iWork, PDF files and more.

Problem solved, assuming you use Google Docs and that you don’t want to create or edit any docs while offline. Memeo Connect Reader just views; otherwise it would probably be called Memeo Connect Editor.

You can also use the iWork for iPad apps to edit docs you have synced to the iPad through iTunes while it was connected to your desktop or notebook. However, if you’re out and about and suddenly need to edit a doc that you didn’t have the foresight to sync to the iPad, you’re more or less out of luck.

Unfortunately, there doesn’t seem to be any way, at least no simple or intuitive way, to grab a file from cloud-based storage and open or edit it with the iWork apps. If you don’t have iTunes to import the file and sync it with iWork, then you won’t be able to edit it on the go.

My trials and tribulations working with simple business productivity files have demonstrated some of the reasons the HP Slate tablet could be a much better business tool than the iPad. However, that functionality has to be balanced against weight and battery life, two areas in which the iPad excels.

Another option, though, could be to use remote desktop solutions like Array Networks or Core Plus to simply stream your desktop or netbook to your iPad. Then, as long as you have a Wi-Fi (or 3G) connection, you can simply use the software you are used to using, and have access to all of your files and data no matter where you might be.

Tony Bradley is co-author of Unified Communications for Dummies. He tweets as @Tony_BradleyPCW. You can follow him on his Facebook page.

The Government Shutdown Has Halted Obama’s $100M Brain Initiative

Since April, the neuroscience community has been gearing up for the Brain Research Through Advancing Innovative Neurotechnologies initiative (BRAIN), the Obama administration’s ambitious plan to map the human brain. Its goal is to “accelerate the development and application of new technologies that will enable researchers to produce dynamic pictures of the brain that show how individual brain cells and complex neural circuits interact at the speed of thought.” The BRAIN initiative is being led by the National Institutes of Health, the National Science Foundation and DARPA. NIH, which pledged $40 million to the initiative, was due to start allocating its BRAIN funding to scientists soon, but now that the government shutdown has furloughed most of its staff, the project, already on a tight schedule, has been put on pause.

At its inception, the BRAIN initiative’s goals were nebulous, and NIH quickly called together several scientists to figure out what exactly the initiative should fund and how it should go about achieving this grand vision. So over the course of just a few months, the 15 volunteer members of the “dream team” working group held seven meetings, including workshops where they invited experts from around the country and the world, to talk about the future of neuroscience and establish a more concrete plan for the initiative. They spent thousands of hours putting together the interim report that NIH Director Francis Collins had requested by September 16, a deadline designed to give NIH time to solicit proposals, review them and give out grants. But now, with NIH at a standstill, the people responsible for making sure NIH’s portion of the BRAIN initiative gets off the ground can’t work.

‘NIH wants to deliver on its end of the bargain, but they simply can’t do it if they’re sitting at home on an unwanted furlough.’

One of the initiative’s co-chairs, Stanford University professor Bill Newsome, told me: “The government shutdown will very definitely affect BRAIN–will bring it to a complete halt in fact.”

“To write good proposals, to get them evaluated, to get the money committed for this next year flowing, that’s a long process–even with the NIH process moving at warp speed, it takes the better part of a year,” he explains. “We on the working group, we delivered our end of the bargain. NIH wants to deliver on its end of the bargain, but they simply can’t do it if they’re sitting at home on an unwanted furlough.”


“If this stoppage is protracted, the start of the BRAIN project in 2014 will definitely be at risk,” he wrote me in his initial email. When I reached him later by phone, he explained that if the government gets back to work relatively quickly, the situation won’t be so dire. “I think if people were called back to work on Friday, this whole thing becomes an incredible nuisance, but it’s not a showstopper.” Still, there’s a lot of money at stake here.

‘I just know that this is no way to run a government, and it’s no way to run support for science…it’s pretty much a disaster.’

The $100 million President Obama promised when he announced the initiative back in April was due to start rolling out in Fiscal Year 2014 (which started on October 1, yesterday). The first $40 million was coming from NIH, which was responsible for coordinating the calls for proposals, approval of projects and getting the funding rolling. “It’s a matter of getting that first $40 million out on the street, and getting the scientists to work on developing new technology and developing new understandings of normal brain function,” Newsome says. “And those understandings of brain function are critical to understanding what goes wrong in neurological and psychiatric disorders. Every month and year we delay in getting this going are going to have consequences.” Putting off this big of a scientific endeavor could mean delaying the potential therapies that would come out of it.

Right before the government shutdown, Newsome and his co-chair, Rockefeller University neuroscientist Cori Bargmann, received word from NIH warning them that it would also mean the shutdown of the BRAIN initiative’s progress within the agency. “We have two key contacts working with our committee, and they both sent emails to Cori Bargmann and myself to say they were going dark–we would not be hearing from them until the government was operational again,” he describes.

“The whole thing is just at a complete standstill. I don’t know what to say,” he explains. “I just know that this is no way to run a government, and it’s no way to run support for science…it’s pretty much a disaster.”
