As an Oracle ACE Director, Gurcan Orhan knows a lot about moving data around. He has more than a decade of experience working with data warehouse concepts, including considerable integration experience. Orhan is, at heart, a data guy, and his concern in this process is that the data in production is accurate and that it can be transferred as necessary. His upcoming session at Oracle OpenWorld 2013 will explore the issues around maintaining data for all of a team's projects.
ODI is a toy with different uses. It depends on what game you want to play.
Gurcan Orhan, Oracle ACE Director
The simplest development projects can be built with a single master data repository. It sits on one server, and each of the three major groups -- development, test and production -- has its own context, or view, into the code. With this sort of setup, the team has to be very careful about making changes, because a small misstep in the master repository can break the live application. But with small teams and simple projects, keeping repository errors from popping up isn't too difficult, and fixing them when they do pop up is similarly manageable. The data warehouse concepts don't really come into play yet.
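The context idea above can be illustrated with a small sketch. This is not ODI's actual API, and the server names are hypothetical; it only models the topology concept Orhan describes, where one body of code resolves a logical schema name to a different physical server depending on which context (development, test or production) is active.

```python
# Conceptual illustration only (hypothetical names, not ODI's real API):
# a single master repository of code, with per-team contexts that map
# logical schema names to different physical database servers.

CONTEXTS = {
    "development": {"SALES_DW": "devdb01:1521/devpdb"},
    "test":        {"SALES_DW": "testdb01:1521/testpdb"},
    "production":  {"SALES_DW": "proddb01:1521/prodpdb"},
}

def resolve(logical_schema: str, context: str) -> str:
    """Return the physical connection a logical schema maps to in a context."""
    return CONTEXTS[context][logical_schema]

# The same code runs against different servers depending on the context:
print(resolve("SALES_DW", "development"))  # devdb01:1521/devpdb
print(resolve("SALES_DW", "production"))   # proddb01:1521/prodpdb
```

The risk Orhan points out follows directly from this design: because every context reads from the one master repository, an error checked in there is visible to production as soon as the production context resolves it.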
As teams and projects get more complex, Orhan said, it quickly becomes necessary to separate the production environment from development and testing activities. This separation protects the live application from mishaps caused by miscommunication between teams. "At this point," Orhan explained, "we have our execution repository separated out on a second server, but the master repository still lives on its own server."
This second setup does use some data warehouse concepts, but it is still not big enough for most enterprises. It assumes a small development team and no dedicated software QA team; the developers are responsible for all their own testing.
Data warehouse concepts for the enterprise
Most enterprises need a dedicated quality team. Adding a third team means adding a third environment with its own version of the data -- one more server and one more repository. Orhan said this is where most enterprise development teams are today.
There are enterprise development teams, however, that need an even larger and more complex system. When enterprises reach a certain size, it becomes difficult to track all the changes that happen. In addition, replicating a large database for multiple teams can become a problem for storage and computing capacity.
Usually, any given development team will only work with a small portion of the full database. The test environment will usually work with a wider range of the stored data, but by no means the entire database. For organizations that are running into this size barrier, Orhan suggests investing in a preproduction environment.
The preproduction environment matches as closely as possible what's really running on the production servers. But it's still not the actual live environment, so mistakes can happen there without dire consequences. After the quality team has run the code through all of its checks, the operations team can deploy it to preproduction exactly the same way it will deploy to production, and confirm that nothing breaks.
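The deployment rehearsal Orhan describes can be sketched as follows. The step names and release identifier here are hypothetical placeholders; the point is only that the operations team runs the identical procedure against preproduction first, so the sole variable at go-live is the target environment.

```python
# Illustrative sketch (hypothetical step names): rehearse the production
# deployment against preproduction using the exact same procedure.

def deploy(release: str, target: str) -> list[str]:
    """Run the same deployment steps regardless of the target environment."""
    return [
        f"export release {release} from the master repository",
        f"import release {release} into the {target} execution repository",
        f"run smoke tests against {target}",
    ]

rehearsal = deploy("rel-2013.10", "preproduction")
go_live   = deploy("rel-2013.10", "production")

# Identical procedure; only the target name differs.
assert [s.replace("preproduction", "production") for s in rehearsal] == go_live
```

Because the two runs differ only in their target, a rehearsal that passes on preproduction gives the operations team high confidence that the live deployment will behave the same way.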
Customize data warehouses for optimum performance
Orhan may be a little biased because of his Oracle ACE Director status, but he recommends using Oracle Data Integrator (ODI) for data extraction, transformation and loading tasks. When asked what specifically ODI does, Orhan was a bit vague, but he did say that it is a versatile middleware tool. "ODI is a toy with different uses," he said. "It depends on what game you want to play." In Orhan's opinion, ODI is the right tool for implementing most data warehouse concepts for the development team.
Orhan also mentioned that ODI may not be at its best for any one particular organization right out of the box. Its knowledge modules make ODI customizable. Orhan claimed he was able to modify a knowledge module that Oracle ships as standard and cut a data transfer from more than 10,000 seconds to less than 200 seconds. He's keeping the particulars of how he did it a secret, but he said he does divulge some ODI knowledge module tips on his blog.
Orhan admitted that his experience working with data integrations in Turkey colors his views on data integration. In Turkey, according to Orhan, large companies never use any software solution exactly the way it comes. He claimed that the high rate of customization in Turkey is partly because these suites don't put much focus on meeting the specific needs of Turkish businesses and partly because the trade laws change frequently -- sometimes major changes come less than two years apart. "It's never SAP or Oracle or JD Edwards out of the box," he said. "These big suites are always customized."