The Attunity Blog
This week, we announced the availability of a new release of Attunity Compose, Attunity’s data warehouse automation software. This new release offers significant enhancements for enterprise customers including 10x faster extract, transform, load (ETL) processing speeds as well as advanced DevOps capabilities that streamline the data warehousing design, development and rollout processes.
By now, Attunity has established that its database replication product, Attunity Replicate, is a high-performance data replication solution supporting a wide range of data sources and targets. As the ROI report by Nucleus Research states, "With Attunity, one organization was able to reduce the data management update process from 10 hours to 20 minutes." There is no doubt that Attunity Replicate can deliver quick time-to-value for any data integration project.
But is it possible to achieve even better results and raise expectations even higher? Yes, it is! With every...
By now you’ve likely read the articles about the recent vulnerability uncovered in OpenSSL that has affected vendors and companies relying on this near-ubiquitous open source security library.
The soothing power of music is well established. It has a unique link to our emotions, so it can be an extremely effective stress management tool. And when stress decreases, productivity increases.
You are probably asking yourself, what does this have to do with Big Data management in your organization? Let’s take a look at this process…
MySQL is widely labelled as the "world's most popular" open source database thanks to its high performance, high reliability and ease of use. Facebook, Google, Adobe, Alcatel-Lucent and Zappos use it to run what are specifically called out as "high volume web sites" as well as the firms' business-critical systems.
In my last blog, I highlighted the many challenges that organizations face with legacy data transfer. The IT industry is responding to these challenges with "legacy modernization" and "legacy transformation": the practice of reusing and refactoring existing core business logic by providing new user interfaces, or by selectively moving data from legacy systems to modern data warehouse systems, where it is integrated and analyzed alongside data coming from 'modern sources'.
What are legacy systems? Legacy systems are simply incumbent computer systems that are installed and working. In other words, the term is not pejorative; quite the opposite. Bjarne Stroustrup, creator of the C++ language, addressed this issue succinctly, with a little sarcasm to drive his point home: "'Legacy code' often differs from its suggested alternative by actually working and scaling."
I am thrilled to be leading a discussion this Friday, November 1, at this year's MassTLC Innovation unConference on overcoming the challenges of making Big Data useful -- and I hope to see you there!
Organizations cannot always avoid disasters, but with careful planning, the effects of a disaster can be minimized.