Tim Gorman, Delphix
DevOps is streamlining the movement of IT systems from development through testing into operations. The biggest constraint on this flow is data. Databases and application stacks have grown enormous. Provisioning an environment for each developer or tester becomes unrealistic when each environment might be hundreds of terabytes in size and agile methods require new environments every two weeks. And so developers and testers end up crowding into shared environment subsets that are refreshed only every few months, if at all.
Data virtualization is a solution to that data constraint. It is now possible to provide each developer and tester with a complete private read-write image of the full application stack and database for each task and project, regardless of size. Provisioning full environments effortlessly allows developers and testers to work concurrently on different tasks with all the resources they need, and with the ability to version entire environments as well as their code changes. Agile development methods, using DevOps techniques for continuous delivery of higher-quality systems, require data virtualization as a tool as much as they require other tools such as Chef, Puppet, VMware, VirtualBox, or Rally.
This presentation will describe the data constraint and its impact on IT, and then explain the solution, including the details of how data virtualization works. Attendees will likely recognize the impact of the data constraint in their own work.