OmniData https://omnidata.com Simplify Complex Data Mon, 14 Sep 2020 19:58:23 +0000 en-US hourly 1 https://wordpress.org/?v=5.5.1 Dashboard in a Day Workshops Scheduled https://omnidata.com/dashboard-in-a-day-scheduled/?utm_source=rss&utm_medium=rss&utm_campaign=dashboard-in-a-day-scheduled Thu, 23 Jul 2020 23:01:10 +0000 https://omnidata.com/?p=1737

The post Dashboard in a Day Workshops Scheduled appeared first on OmniData.

Free August and September classes available now

OmniData joins Microsoft in offering Free Dashboard in a Day Discovery Sessions

Tobias Eld, OmniData partner and Azure guru, is certified for Microsoft’s Dashboard in a Day program. As a top Power BI expert, he will guide ambitious clients through best practices in Power BI, Microsoft’s number-one next-generation BI tool for management dashboards. If this sounds like you, schedule a visit to OmniData for this online class in August or September. “This is an ideal session for clients seeking an overview of where Power BI is now and where it will take your company in the future. You will come away with an understanding of the direction your business should take to build a BI program with Power BI as the foundation,” Mr. Eld said. “We’re excited we can provide this overview with the lab support of Microsoft.”

Microsoft trained top Azure Data Gold partners for this program earlier in 2020, and Mr. Eld was among the first senior consultants to receive certification. When you’re ready to sign up for this terrific one-day program, click here.

What You’ll Learn

Why participate?
At the end of the day, you will have learned how to:

  • Organize your structured and unstructured data to present insightful visuals in Power BI.
  • Define organizational KPIs that drive strategy.
  • Expand on your organization’s KPIs with drill-downs to detail for instant insight.
  • Generate reports using Power BI for top management or for immediate performance feedback.
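The KPI-plus-drill-down idea from the list above can be sketched in a few lines. This is a toy illustration with invented sales data and field names, not part of the workshop materials; in Power BI the same relationship is expressed with measures and drill-down visuals.

```python
from collections import defaultdict

# Invented sales records; field names are illustrative only.
sales = [
    {"region": "West", "rep": "A", "revenue": 1200},
    {"region": "West", "rep": "B", "revenue": 800},
    {"region": "East", "rep": "C", "revenue": 1500},
]

# Top-level KPI: total revenue across the organization.
total_revenue = sum(row["revenue"] for row in sales)

# Drill-down: the same KPI broken out by region for instant detail.
by_region = defaultdict(int)
for row in sales:
    by_region[row["region"]] += row["revenue"]

print(total_revenue)    # 3500
print(dict(by_region))  # {'West': 2000, 'East': 1500}
```

The point is only the relationship between a headline KPI and its supporting detail: the drill-down is the same measure, partitioned.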

About OmniData

OmniData provides products and services at every phase of the data lifecycle. We are passionate about your success: we mine your hidden data assets and accelerate your time to data insights 10X.


Free Excel, Power BI, and Azure Training Offered by Microsoft https://omnidata.com/free-microsoft-traiing/?utm_source=rss&utm_medium=rss&utm_campaign=free-microsoft-traiing Tue, 30 Jun 2020 20:18:44 +0000 https://omnidata.com/?p=1729

The post Free Excel, Power BI, and Azure Training Offered by Microsoft appeared first on OmniData.

OmniData Teaches all Four One Day Trainings
Azure Analytics Classes
Learn all aspects of the Azure Analytics offerings

Microsoft has structured free training on the Azure Analytics Stack into four single-day classes. The classes cover everything that goes into a world-class, next-generation business intelligence program. OmniData, as a Microsoft Gold Partner, will participate at all levels of the multiyear training strategy. The goal of these classes is to demystify, at every level of your business, the best practices behind an enterprise-wide business intelligence program. The result: participants increase data literacy throughout their organization.

OmniData’s top consultants will be the trainers in all four of Microsoft’s workshop programs: “Dashboard in a Day”, “Modern Excel Analyst in a Day”, “Paginated Reports in a Day”, and “Analytics in a Day”. Because the audiences differ, each program targets a different level of data literacy, offering a different bridge to the Modern Data Estate for individuals at different points in their journey.

Dashboard in a Day

Dashboard in a Day is where the program began. Microsoft rolled this one-day class out several years ago with great success, and by 2019 they had realized just how beneficial it was. The class brings customers, partners, and Microsoft itself closer together while creating a foundation for success with the product. Aimed at analysts, it teaches students in just one day how much greater their contribution to their organization can be with this ever-evolving product.

Modern Excel Analyst in a Day

Modern Excel Analyst in a Day is designed for the massive audience that continues to rely on Excel in its business reporting ecosystem. Excel has been the standard analytics tool in the workplace since the mid-1980s, so a business that intends to develop a next-generation analytics function cannot ignore the past: it must build a bridge from Excel to Power BI and back again. Modern Excel Analyst in a Day covers this ground.

Paginated Reports in a Day

Paginated Reports in a Day covers the specialized ground of producing pixel-perfect reports as opposed to dashboards. Dashboards, by their nature, are exploratory and highly interactive; reports, by comparison, are static and produced for distribution to a larger segment of an organization. This hands-on class is also free, and covers the uses, functionality, and execution involved in the static-reporting side of the Azure Analytics Stack.

Analytics in a Day

Analytics in a Day provides an overview of the entire Azure Analytics Stack. Many people see the simplicity of a great dashboard and do not realize they are seeing only the tip of the iceberg: the stack includes many highly coordinated elements that position an organization to mine, clean, enrich, analyze, and visualize its data. Great organizations must orchestrate all of these activities to compete and profit with data, so an organization-wide understanding of the entire stack is required for world-class analytics. This one-day workshop provides that understanding.

Why Azure Analytics Trainings Matter

Azure Analytics Stack trainings are worth the one-day commitment of your time. As Azure outpaces its cloud competition, these classes prepare every member of your organization to enter the next generation of analytics. On the dimensions of cost, security, scalability, and feature functionality, the Azure Analytics Stack is outracing all comers. With free training targeted at every participant in your business, these programs are a must for bringing data literacy and world-class business intelligence to your entire organization.

About OmniData

OmniData provides products and services at every phase of the data lifecycle. We are passionate about your success: we mine your hidden data assets and accelerate your time to data insights 10X. In addition, we are a Microsoft Gold Partner committed to keeping you on the cutting edge of the Azure Stack.



Analytics in a Day Workshops Scheduled https://omnidata.com/analytics-in-a-day-scheduled/?utm_source=rss&utm_medium=rss&utm_campaign=analytics-in-a-day-scheduled Mon, 22 Jun 2020 22:55:19 +0000 https://omnidata.com/?p=1634 This is the knowledge you gain

The post Analytics in a Day Workshops Scheduled appeared first on OmniData.

Azure Analytics Classes

OmniData joins Microsoft in offering Free Analytics in a Day Workshop Sessions

Tobias Eld, OmniData partner and Azure guru, is certified for Microsoft’s Analytics in a Day workshop program. He will guide ambitious clients through setting a best-practices foundation for analytics in Azure. If this sounds like you, schedule a visit with us in September, October, or November. “This is an ideal session for clients seeking an overview of where Azure is now and where it is headed. You will come away with an understanding of the direction your business should take to build a BI program in the Azure cloud,” Mr. Eld said. “We’re excited we can provide this overview with the lab support of Microsoft.”

Microsoft trained top Azure Data Gold partners for this program earlier in 2020, and Mr. Eld was among the first senior consultants to receive certification. To sign up for a free Analytics in a Day workshop, click here.

The Azure platform, Azure Synapse, and Power BI combine into Microsoft’s most powerful next-generation business intelligence offering. We will share with you Microsoft’s and OmniData’s vision of Limitless Analytics.

What You’ll Learn

Why participate?
At the end of an Analytics in a Day workshop, you will have learned how to:

  • Collect all your structured and unstructured data using Azure Data Factory and Azure Blob Storage.
  • Use Databricks to prepare, aggregate, and summarize data, both in batch and in streaming.
  • Use the connectors between Azure Databricks and Azure SQL Data Warehouse to access and move data at scale.
  • Generate reports using Power BI with Azure SQL DW data.
  • Understand all facets of Azure Synapse Analytics.
  • Get started on the journey to Azure analytics excellence.

About OmniData

OmniData provides products and services at every phase of the data lifecycle. We are passionate about your success: we mine your hidden data assets and accelerate your time to data insights 10X. Our Analytics in a Day workshops are taught by our top consultants, adding value to an already valuable Microsoft program.


Data Warehouse Automation Optimizes Govt Response to Covid-19 https://omnidata.com/data-warehouse-automation-optimizes-govt-response-to-covid-19/?utm_source=rss&utm_medium=rss&utm_campaign=data-warehouse-automation-optimizes-govt-response-to-covid-19 Tue, 12 May 2020 01:10:27 +0000 https://omnidata.com/?p=1076

The post Data Warehouse Automation Optimizes Govt Response to Covid-19 appeared first on OmniData.

Part 2 – Data Warehouse Automation is the next logical step for stability and growth

This article on Covid-19 Data Warehouse Automation was written by Douglas Textor (https://omnidata.com/about#team) with Susan Pessemier.

In Part One, you read about a prototype built rapidly in Washington State in response to COVID-19. Government officials, right up to Governor Inslee, use this information to make agile decisions and to inform the public. The dashboards are available to the general public here, and they update at varying intervals as the data becomes available. Of note, the data architecture is ad hoc, a result of the rapid response. This article makes the case that Data Warehouse Automation is the next logical deployment.

Washington, with the HealthTech Community Response Council, has produced nothing short of an information technology miracle with its current prototype. It is best in class as far as state and even national governments go. However, if Washington continues down an ad hoc path, that best-in-class response may slow dramatically for lack of supportability and scalability. The recommendation is to deploy Data Warehouse Automation (DWA), the required next step to stabilize and grow the response platform.

Current State COVID-19 Response – Before Data Warehouse Automation

Illustration of Data Architecture, including challenges in current Washington State COVID-19 information system prototype.
Current COVID-19 Data Response Environment in Washington State. Best of breed, but lots of potential points of failure going forward.

After Data Warehouse Automation

Data Warehouse Automation applied to Washington State pandemic information system.
A DWA Tool addresses the needs for speed, reliability, scalability, supportability and simplicity. Nothing could highlight the usefulness better than a public crisis.

I have chosen to keep this simple. Had I ranged beyond my own subject, the scope of this article would have been too broad and far too technical to be helpful, traversing subjects like epidemiology, data science, and massive logistics. So the subject is only the basics of information technology development and adoption: people, data, systems, and new developments. The goals are speed, reliability, agility, and simplicity.

Here are some deep-dive links to experts in the related areas. The needs brought on by a crisis change daily; later, in the aftermath, new requirements arise with the same regularity. For a deep dive into the stages of crisis management, read here, from the Tohoku Journal of Experimental Medicine. The picture below, from that article, summarizes the complexity and the need for information and speed in disaster recovery.

Framework for phases of Health Crisis Management
Framework by Frederick M. Burkle Jr., from The Tohoku Journal of Experimental Medicine

Need for Speed

The virus is changing and our understanding of how to deal with it changes daily, too. Mistakes cost a lot of lives. There isn’t much more to say on the subject of speed.

Unstructured Data

Unstructured data includes any data stored in a computer for one purpose that now needs to serve a new one, as well as data people gather into scrawled lists and spreadsheets. To prepare such data for consumption, a patchwork of scripts (code) typically emerges to structure, organize, and translate it, all while the data makes its way to the data warehouse that allows the dashboards to actually work. The first, “before” graphic above illustrates the current state of the unstructured COVID-19 response data.

My first impulse was to map all of the data sources that flow into the state dashboards, using the well-documented descriptions on the state website that lend credibility to the graphs. This is a normal exercise when stepping into a Data Warehouse Automation project. Had I continued, it might have taken me days simply to produce a draft, with no verification and certainly no real detail. Instead, I counted over twenty-five different sources of data when I surveyed the Washington State dashboards. Each source varies in speed, availability, human intervention, and staffing, and thus in reliability.

Partial list of Pandemic Response Data Sources

  1. All Hospitals – New Covid Hospitalizations, Deaths, Releases, Status at release, etc.
  2. Local Health Jurisdiction Websites – The data can possibly be automatically “scraped”, but is more likely currently read and manually input by a person.
  3. Tracked Illnesses – This data is very incomplete, as testing has been slow in the U.S. to get to appropriate levels. However, as the data becomes more complete, this measure will become useful to help those who may have been exposed.
  4. Demographics – Requires that the basic data become richer and more enhanced, tagged with age, sex, other medical complications, etc.

The list goes on, and the longer it gets, the greater the need for a Data Warehouse Automation tool, assuming the intention is to provide decision-makers and the public with better information.

Re-purposed data

An example of re-purposed data might be hospital admissions. Each hospital may keep admissions data for everything from tracking patient progress and movement to providing insurance information. So, yes, the data exists in a system. But new demands require that this data be provided twice a day to the state system, suddenly giving the data a different purpose. Consequently, it may not be easy to provide the information in a form the pandemic systems can readily consume, which means very busy people must suddenly shift gears and figure out how to provide it. Multiply that problem by each of the 115 hospitals in the state, and it is easy to see how complications compound from this one example.
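A minimal sketch of the re-purposing step described above: a record kept in a hospital's own schema must be mapped onto whatever schema the state feed expects. Every field name and value below is hypothetical, invented purely to illustrate the shape of the problem.

```python
# A record as one hospital system might store it (hypothetical schema).
hospital_record = {
    "pt_id": "H-0042",
    "admit_dt": "2020-05-01",
    "dx_code": "U07.1",   # ICD-10 code for confirmed COVID-19
    "payer": "PrivateIns",
}

def to_state_format(rec):
    """Reshape a hospital record for a (hypothetical) state reporting feed."""
    return {
        "patient_ref": rec["pt_id"],
        "admission_date": rec["admit_dt"],
        "covid_positive": rec["dx_code"] == "U07.1",
    }

print(to_state_format(hospital_record))
```

Each of the 115 hospitals would need its own version of this mapping, since each stores admissions differently; that is exactly the multiplication of effort described here.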

Why Data Warehouse Automation?

Here is an explanation of Data Warehouse Automation for lay people. Under ordinary circumstances, unstructured data exists everywhere: data people gather into scrawled lists and spreadsheets, and any data stored in a computer for one purpose that now needs to serve another.

To prepare the data for consumption, a patchwork of scripts (code) typically emerges to structure, organize, and translate the data for its new purpose. These scripts are the product of highly skilled, expensive computer programmers. As each script is written, the benefit is that the data gets where it needs to go. However, as scripts proliferate, so does the cost of maintaining them. If a script breaks at a critical moment, when a decision-maker needs the information, the cost can be high, and unless the scripts are well organized, the time and money required to hunt down the faulty one drive maintenance costs even higher. Data Warehouse Automation tools organize all of these scripts in one place, improving the ability to maintain and expand the information solution.
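As a toy example of one such script (all data, fields, and formats invented for illustration): it reads a CSV export produced by one system and reshapes it into the per-facility totals a dashboard feed might need. A DWA tool's value lies in generating, documenting, and centralizing dozens of transformations like this, rather than leaving each one as hand-maintained code.

```python
import csv
import io

# A CSV export as some upstream system might produce it (invented data).
raw = """facility,report_date,new_cases
General Hospital,2020-05-01,12
General Hospital,2020-05-02,9
County Clinic,2020-05-01,4
"""

# One-off transformation: total new cases per facility for the dashboard feed.
totals = {}
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["facility"]] = totals.get(row["facility"], 0) + int(row["new_cases"])

print(totals)  # {'General Hospital': 21, 'County Clinic': 4}
```

Multiply this by twenty-five-plus sources, each with its own format and quirks, and the maintenance burden the article describes becomes concrete.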

Expansion of the issues that DWA addresses from the first graphic above

Data Warehouse Automation dramatically improves insights

See the second graphic above for a picture of DWA applied to Washington’s systems. Note that the entire patchwork of scripts is replaced by the DWA tool. On the left, the available data is still chaotic; as we move to the right, the organized data warehouse is much fuller than in the first graphic, and the availability of information, represented by the dashboards on the right, has increased dramatically. The ability to respond to the pandemic with information has improved on the dimensions of speed, reliability, scalability, supportability, and simplicity.

For a deeper dive into the technical aspects of Data Warehouse Automation, Eckerson Group offered this last year. At OmniData, we like TimeXtender and chose them as our DWA software provider of choice. Their software does not break the bank, has been on the market for years, and simplifies the complex. TimeXtender expresses its value in simple terms: position yourself to save 10X in time to deploy your data warehouse. I would also highlight the ongoing benefits for operational maintenance and support. It is a terrific tool for preparing data for display in Microsoft’s Power BI analytics and visualization platform.

Next steps for Washington and other states

For Washington, the obvious next step is to adopt a Data Warehouse Automation tool and keep going. Washington’s leadership defined its key information requirement very well: the information fuels decisions and helps communicate with the public at large. With the help of the private companies of the HealthTech Community Response Council, Washington has created a best-in-class pandemic response platform. In conclusion, a DWA tool will make the system faster, more reliable, supportable, scalable, and agile.

For other states, these articles provide a foundation for copying Washington, because a pandemic is no time for NIH (not-invented-here) behavior. The results in Washington State will help those who are risk-averse in information technology believe that they, too, can achieve the same level of success. Where the goal is to build better emergency response systems, the need is clear. Washington’s example is inspiring and will be a catalyst for others’ success.

About OmniData

OmniData provides products and services at every phase of the data lifecycle. We are passionate about your success: we mine your hidden data assets and accelerate your time to data insights 10X.

Microsoft Gold Data Partners


Data Analytics in the Covid-19 Trenches https://omnidata.com/covid-19-data-analytics/?utm_source=rss&utm_medium=rss&utm_campaign=covid-19-data-analytics Thu, 07 May 2020 17:44:21 +0000 https://omnidata.com/?p=1046

The post Data Analytics in the Covid-19 Trenches appeared first on OmniData.


How Washington State’s Pandemic Insights Got a Jump-Start.

This article on Covid-19 Data Analytics was written by Douglas Textor with Susan Pessemier.

Data Visualization of Pandemic in Washington State
Washington State Pandemic Dashboard

Good, fast decisions require good information, and when the pandemic struck, the data in Washington State was in too many inaccessible places. Remarkably, the state’s initial decisions, based on incomplete information, were very good. But with the Covid-19 virus’s virulence, the decision requirements remain after many months, and the high stakes change daily, if not hourly. How did the information required to manage the pandemic appear on short notice? First, you need to gather the data.

If you are in the business of data analytics, or converting data to information, you knew right away when the pandemic started that there would be severe information problems. Just add one and two: 1) officials need information on how the virus is moving; 2) that information is in too many different places.

More Questions than Answers

Consider questions such as: Where is emergency response data? Who knows the rate of hospital admissions at private hospitals? Where is nursing home data? Is there really a morgue, or just a collection of private mortuaries? On and on the questions fly at leaders, and the answers are not readily available.

Importantly, since the pandemic started, new questions have arisen each day. In the business of creating knowledge from data, fresh answers lead to the next questions: What is the availability of PPE (personal protective equipment) for front-line health workers? What about folks who died at home and were never tested? Where are the hot spots today, and where will they be tomorrow?

You can find answers to some of these questions with digging, research, and a lot of phone calls. However, can you get the answers updated accurately four times a day, satisfying the equivalent of wartime decision-making?

Of course, OmniData and TimeXtender often solve these types of problems. We see each instance as ungoverned data from many sources, and our job is to structure all of that data in a usable way so that leaders can make good decisions. Once gathered and structured, the knowledge and information are best displayed in maps and data visualizations. Notably, we have rarely seen the need for speed that we have seen here.

TimeXtender Became a Catalyst for Covid-19 Data Analytics

Susan Pessemier of TimeXtender happens to be OmniData’s partner manager. It is through her efforts that I have had a front-row seat to the developments in Washington State that have led, in record time, to Governor Jay Inslee receiving daily reports and to the State Department of Health providing the following interactive dashboard on its website.

Governor Inslee believes data drives good decisions

Here is how this came to be in early March 2020. Susan's husband has many years in emergency and disaster management, and he saw how TimeXtender's technology could improve the data and information flow required to respond during a disaster. TimeXtender's leadership agreed to assist in the form of free software and services.

A council quickly formed: the HealthTech Community Response Council, which included Microsoft, TimeXtender, and the Washington Technology Industry Association. Its purpose was to voluntarily serve the needs of the state, municipalities, and public health organizations. Key leaders from the state agencies were eager to attend the initial forum and share what data they needed, and from whom. In the opening minutes of the first forum meeting, they acknowledged their inability to stand up new solutions without outside help. The DOH exists for public health, not for reporting data. The pandemic is far larger than any single entity and requires coordination among many entities, along with cooperation from the public. Needless to say, the public is more cooperative when given clarity on circumstances, and clarity comes from simple, relatable, powerful measures, charts, and graphs.

Testing information across Washington State

Offers of Help - Difficulty Consuming the Help

From the start, there was an overwhelming response, with offers of help from the HealthTech Community Response Council for Washington's health providers. At first, no one could explain exactly what they needed or how they would get the data, reflecting the mysterious nature of the virus itself. Given the urgency at hand, the agencies weren't shy about expressing their confusion over how to consume all of the offers of help. Again, health agencies and hospitals are staffed for health-related activities, NOT for information technology.

That was the crux of the matter: isolating the needs and organizing the offers of help and the resources available. It quickly became clear that this would have to happen through collective leadership, with individuals working outside their comfort zones and beyond their normal jobs.

Speed and Results

Clarity on a few things here:

1). One month ago, no one could have built the dashboards displayed on the DOH website. The data was in too many different places, controlled by too many different people.

2). No one individual or entity can take credit for the creation of the dashboards. For instance, Microsoft has offered extensive infrastructure resources, development services, and guidance. TimeXtender has participated in the organization and planning; going forward, it's my opinion that TimeXtender's products will emerge as the best available to formalize and automate the data collection, governance, and architecture. Washington DOH has, naturally, participated in the organization and governance.

3). On the ground, each hospital has suddenly risen to the task of providing the data; the pandemic forced staff to start collecting it. The information may never have been reported this way until now. Going forward, data reporting will become a requirement in places where, only a short time ago, it was not.

From Prototype to Production Ready

The result, from the link above and in the graphic provided in the article, is a powerful example of what Microsoft's Power BI platform can produce in a short period of time. However, the result functions as a prototype, meaning that sometimes it breaks and sometimes the data is not completely up to date. For full reliability, speed of updating, and expansion of capability, a new set of skills and tools will need to be applied to the problem. Those skills and tools are available from the HealthTech Community Response Council members. Unfortunately, the health agencies and hospitals are stretched past their capacity to grow in the information technology business; they are busy saving lives.

The results for Washington's Covid-19 data analytics since that first meeting in March get high grades, though the grade depends on whom you ask. Perfectionists and lay people might say, "So what? Just do it. Just make the computer do it." However, folks with experience in data analytics development will see this another way: if you study leadership and team building around information projects, the results are impressive. The team deserves to be commended.

Certain important aspects update once per week. Automation would help this.

The Washington State DOH dashboard only scratches the surface of the information that can be valuable for managing the pandemic. Once the pandemic subsides, other day-to-day health policy and government support requirements will emerge. We'll continue to report here on what comes next.

Rapid Prototyping is here to stay - Data Automation must follow

The result was produced while the team formed. Folks with a variety of skills climbed steep learning curves to understand the goals; many participated well beyond the time they had available and outside their skill comfort zones. If you are a veteran of information technology, data analytics, and digital transformation, you may be cringing, because, OMG, a million things can go wrong from here. (The website crashed a couple of times while I was writing this today, and the data, when I last checked, was 36 hours old.)

Others, like myself, who love to see speed in innovation are cheering. This may never have happened without the gravity of a pandemic. Ordinarily, state and private entities don't have the knowledge, skill sets, or time to innovate past their primary objectives. However, the right information technology tools and the will of a team can accomplish great things.

What Needs to Happen Now for Washington's Covid-19 Data Analytics?

New questions emerge daily. The rapidity of the Covid-19 data analytics solution's turnaround means it may be frail, and the staffing that provides ongoing data may be too stretched. Leaders will need to understand that frailty and find a way to bring resources to make the information flow more dependable. Our follow-up article, Part 2, will outline in detail the next steps toward reliability, growth, and sustainability of this fine data architecture and data analytics effort.

Notes: Neither OmniData nor TimeXtender is attempting to take credit for the effort described in this article; thus far, OmniData has only watched from the sidelines. This article celebrates rapid prototyping in a crisis and, of course, shows the value of data analytics in difficult decision-making.

About OmniData

OmniData provides products and services at every phase of the data lifecycle. We are passionate about your success: we mine your hidden data assets and accelerate your time to data insights 10X.



OmniData Remote Solutions https://omnidata.com/omnidata-remote-solutions/?utm_source=rss&utm_medium=rss&utm_campaign=omnidata-remote-solutions Fri, 20 Mar 2020 23:35:13 +0000 https://omnidata.com/?p=707


Maintaining Productivity during COVID-19

The Coronavirus/COVID-19 is now having an impact on every family in the United States. We empathize: you and your family are affected, and most of your businesses are as well. For many of our clients we are implementing urgent measures to support a workforce working from home, as well as immediate changes to reporting and dashboards in response to the changes in customer behavior that you are seeing.

Here at OmniData, we are all working from home, and we now deliver analysis sessions remotely using Microsoft Teams and Microsoft Whiteboard. We have implemented a new onboarding process that enables new employees to get trained and start delivering value for your business without ever coming to the office.

We wanted to share with you an overview of the ways that we are changing our delivery methodology to ensure that we continue to deliver value to our clients. 

Committed to your success at all times

Download printable pdf here

Please do not hesitate to reach out to us at this time; we are here and committed to helping you adjust to the new reality of doing business under lockdown.

The OmniData Executive Team 

Doug Textor, Dan Erasmus, Tobias Eld 

The post OmniData Remote Solutions appeared first on OmniData.

Best Cloud Provider: Microsoft With Azure Synapse https://omnidata.com/best-cloud-provider-microsoft-with-azure-synapse/?utm_source=rss&utm_medium=rss&utm_campaign=best-cloud-provider-microsoft-with-azure-synapse Wed, 20 Nov 2019 01:44:02 +0000 https://omnidata.com/?p=429

Ignite Conference Announcements Advance Microsoft Azure on Many Dimensions
Microsoft Azure emerges as best cloud provider

Microsoft Azure is now clearly the best cloud provider. In cloud scalability, Microsoft has pulled far ahead of Amazon Redshift, Google BigQuery, and Snowflake. The crucial measures are speed, administration, supportability and cost. Check out Microsoft’s demo of Azure Synapse from the 2019 Ignite conference.

TimeXtender With Azure extends Microsoft’s status as Best Cloud Provider

TimeXtender supports the new Azure releases from Microsoft because Azure Synapse builds directly on Azure SQL Data Warehouse. Discovery Hub is part of Microsoft’s Reference Architecture for Cloud Scale Analytics. Read more about this here. This foresight is a great example of why smart choices matter in IT investments. Your own IT investments matter because your Modern Data Estate scales on foundational elements like compatibility, supportability, speed, administration and cost, and these elements compound as you invest in each architectural element.

Joseph Treadwell, TimeXtender’s Solution Specialist Director in North America, states: “Microsoft’s Cloud Scale Analytics architecture is the most robust data management solution on the market. But businesses are still hand-coding rules and models using several different native languages. [Even with its inherent advantages] this creates a brittle solution that requires an expensive set of hard-to-find skills and is slow to adapt to business demands.” TimeXtender’s Discovery Hub orchestrates from every source across any cloud or on-prem. Your enterprise’s unique business rules are built straight into your entire cloud architecture, and your single source of truth, your enterprise data warehouse, can be updated from any number of sources in the cloud and on premises. Voila: a Modern Data Estate, with unlimited scale and fully automated.

Deeper Dive: Azure Synapse Benchmark Proves Status as Best Cloud Provider

Every benchmark of Azure’s offering bears out Microsoft’s clear leadership. Read the results of the cloud benchmark from GigaOm, a reliable, objective third party. When you understand that the competition lags in simple benchmarks, and that these lags compound as your environment grows, there is only one logical conclusion: Microsoft is the obvious best cloud provider. Invest alongside Azure and TimeXtender.

The Admin Angle – Azure Arc

Azure Arc, another Microsoft announcement at the Ignite conference, allows administrators to consistently manage resources anywhere, including on AWS and Google Cloud. Clearly Microsoft understands the diversified nature of IT organizations and is doing its best to accommodate them. Azure Arc is not a “thin wrapper” or a marketing ploy. Taking a deep dive, we discover that databases, Kubernetes clusters, and servers sprawling across on-premises, edge and multicloud environments are all supported through one clean administrative layer, delivered with top quality and supportability by Microsoft. This allows organizations to grow their investment in Azure on a realistic, stepwise basis, ultimately building their Modern Data Estate. Walk one step at a time toward adopting the best cloud provider.

Salesforce Builds With Azure

Supporting this article’s premise, Salesforce just announced a major foundational investment in Azure. It is time to follow the smart investors: Microsoft Azure will only get better. Perhaps the competition can OEM the product and catch up. Microsoft is the obvious best cloud provider.

About OmniData Insights

OmniData Insights provides products and services at every phase of the data lifecycle. We exist to cultivate your Data Estate and to spread data literacy to every person in your organization. Call on us for advice on your Modern Data Estate or next-generation BI efforts. We’re here to help, whether with a live POC or just pointing you in the right direction. Find out how we can accelerate your time to data insights 10X.

OmniData Insights Gold Microsoft Partner

The post Best Cloud Provider: Microsoft With Azure Synapse appeared first on OmniData.

Migrate From SQL 2008 Before Support Ends https://omnidata.com/migrate-from-sql-2008-before-support-ends/?utm_source=rss&utm_medium=rss&utm_campaign=migrate-from-sql-2008-before-support-ends Tue, 02 Jul 2019 01:48:48 +0000 https://omnidata.com/?p=238

Migrate from SQL 2008 at 10x the speed and reliability using TimeXtender.

Microsoft is ending support for SQL Server 2008 on July 9, 2019. With the end of support, many organizations are struggling to migrate their SQL Server 2008 sources and data warehouses to a Modern Data Warehouse. TimeXtender Discovery Hub can accelerate this process by 10x.

For a quick demo, check out this video from TimeXtender. The video also gives a great look at the new user interface that Tobias Eld wrote about here. For a description of the features, download the overview PDF here.

More Databases Increase the Necessity for Data Warehouse Automation

If you are using SQL Server with software applications, your software vendor may provide tools to migrate your data to newer databases. But if you are running one or more data warehouses, data marts or reporting environments built by pulling data from transactional databases, you will most likely have to rebuild the environment manually. Writing the code to recreate your data warehouse is extremely time-consuming and expensive.

Discovery Hub, from TimeXtender, provides a toolset for capturing the structure of your current data warehouse and automates the creation and updating of a new data warehouse that can be built on any Microsoft data platform. 
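To make the idea of metadata-driven warehouse automation concrete, here is a minimal sketch of the general technique: read a table's column metadata (as a migration tool might from a source database's INFORMATION_SCHEMA.COLUMNS view) and regenerate an equivalent CREATE TABLE statement. The function name, tuple shape, and sample columns are illustrative assumptions, not Discovery Hub's actual implementation.

```python
def generate_create_table(table_name, columns):
    """Render a CREATE TABLE statement from captured column metadata.

    columns: list of (name, data_type, is_nullable) tuples, the kind of
    information a metadata query against the source would return.
    """
    col_lines = []
    for name, data_type, is_nullable in columns:
        null_clause = "NULL" if is_nullable else "NOT NULL"
        col_lines.append(f"    [{name}] {data_type} {null_clause}")
    body = ",\n".join(col_lines)
    return f"CREATE TABLE [{table_name}] (\n{body}\n);"

# Hypothetical metadata captured from a legacy SQL Server 2008 source
customer_columns = [
    ("CustomerID", "INT", False),
    ("Name", "NVARCHAR(100)", False),
    ("Region", "NVARCHAR(50)", True),
]
ddl = generate_create_table("DimCustomer", customer_columns)
print(ddl)
```

Because the DDL is generated from metadata rather than hand-written, the same captured structure can be re-emitted against any newer Microsoft data platform, which is the core of the 10x claim.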


The post Migrate From SQL 2008 Before Support Ends appeared first on OmniData.

June 2019 Discovery Hub Release https://omnidata.com/june-2019-discovery-hub-release/?utm_source=rss&utm_medium=rss&utm_campaign=june-2019-discovery-hub-release Thu, 27 Jun 2019 22:19:02 +0000 https://omnidata.com/?p=225

This week, TimeXtender released a new version of Discovery Hub. I have been playing around with the preview for the last several weeks, so I wanted to share my impressions. As far as quality-of-life features for the Discovery Hub developer go, this is a huge release. If you haven’t upgraded for a while, now is a good time to consider doing so.

Revamped User Interface

I am so excited that we now have a new user interface to work with. This is something TimeXtender has wanted to do for a very long time; I remember seeing a prototype back in early 2013.

There are now three permanent components in the UI. On the far left is the solution explorer. On the far right is the data movement pane, which is empty when no object with a data movement or transformation context is selected. Most importantly, in the center is the main working area, where you can open any number of tabs and group them as needed.

The new UI utilizes the full screen, giving a much better flow to the work of dragging elements between tabs; I generally set it up from left to right. If you are not used to it, now is the time to start using Discovery Hub in full screen.

Support for Azure Data Lake Gen 2 and Azure Databricks

On the storage side, the ODX Server adds support for Azure Data Lake Store Gen 2 in addition to the existing support for SQL on-premises, Azure SQL (MI) and Azure Data Lake Store Gen 1.

On the data movement side, Azure Data Lake Analytics is replaced with Azure Databricks for handling advanced incremental load (updates and deletes), incremental load to the data warehouse, and selection rules. Importantly, Databricks can now handle the transfer between Azure Data Lake Store and the data warehouse, reducing the Discovery Hub application’s direct interaction with the data to just the initial load from source to the ODX.
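To illustrate what "advanced incremental load (updates and deletes)" means in practice, here is a minimal in-memory sketch of merge semantics: upsert rows by key and remove destination rows whose keys no longer appear in the source. The function name and row shapes are hypothetical, not Discovery Hub's real API; in the product this work would run as Databricks jobs against data lake storage.

```python
def apply_incremental_load(destination, source, key="id"):
    """Merge source rows into destination: upsert by key, delete absent keys."""
    dest_by_key = {row[key]: row for row in destination}
    source_keys = {row[key] for row in source}
    for row in source:
        dest_by_key[row[key]] = row  # insert new rows, overwrite changed rows
    for k in list(dest_by_key):
        if k not in source_keys:
            del dest_by_key[k]       # delete rows no longer present in source
    return list(dest_by_key.values())

# id=1 is updated, id=2 is deleted, id=3 is inserted
dest = [{"id": 1, "qty": 5}, {"id": 2, "qty": 7}]
src = [{"id": 1, "qty": 9}, {"id": 3, "qty": 2}]
result = apply_incremental_load(dest, src)
```

The same three operations (insert, update, delete) expressed here as dictionary manipulation are what a MERGE statement or a Databricks job performs at warehouse scale.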

This is a significant move towards a fully managed PaaS experience with Discovery Hub.

Improved Upgrade Experience

This is a nice little bonus: when you install the new version and activate your license, you are prompted with a dialog that allows you to import settings from a previous version of Discovery Hub. The wizard also takes you through connecting to your ODX Server, upgrades the repository, and reminds you to update the settings of the Windows services. While we wait for a proper in-place upgrade, this is a good step forward.

You still have to remember to configure the correct repository settings for your service user after the upgrade.

If you have not already upgraded to version 19.2.x, you will also be taken to the performance recommendations wizard when you open your first project. This great tool helps make sure you take advantage of the performance improvements released over the years. If you have been using Discovery Hub for a long time without methodically adopting the new performance features, this could yield significant gains.

Keep lookups up to date

A feature released in beta in version 18.10 (that is, October 2018) allows you to keep lookups up to date for tables that are incrementally loaded. This means that when you perform a lookup from a source table, the lookup value in the destination table is updated even when the destination record itself did not change. This is achieved by clever use of hash values to detect when an update of the destination table is needed.

With the original implementation, certain changes in the source table could trigger an unnecessary full load of the destination table. The final implementation in this version addresses that issue and slightly simplifies the configuration (the entire-row hash key is gone, and so is its check box; we won’t miss them).

This is a great feature that addresses an issue that is sometimes overlooked with annoying consequences down the road, and it performs well even with large amounts of data.
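The general technique behind this feature, hash-based change detection, can be sketched in a few lines: hash the columns a lookup depends on, store that hash on the destination row, and update only when the hash differs. The function names, column names, and `_lookup_hash` field below are illustrative assumptions, not Discovery Hub internals.

```python
import hashlib

def row_hash(row, columns):
    """Stable hash over only the lookup-relevant columns of a row."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def needs_update(dest_row, source_row, lookup_columns, hash_field="_lookup_hash"):
    """True when the source's lookup columns changed since the last load."""
    return dest_row.get(hash_field) != row_hash(source_row, lookup_columns)

# After the first load, the destination stores the hash of the lookup columns
source = {"customer_id": 42, "region": "West"}
dest = {"customer_id": 42, "region_name": "West", "_lookup_hash": None}
dest["_lookup_hash"] = row_hash(source, ["region"])

unchanged = needs_update(dest, source, ["region"])  # source untouched
source["region"] = "East"                           # lookup column changes
changed = needs_update(dest, source, ["region"])    # hash mismatch detected
```

Because the hash covers only the lookup columns, unrelated source changes no longer force work on the destination table, which is exactly the unnecessary-full-load problem the final implementation fixed.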

Other updates

I also want to plug a small change: the battle over the right way to create lookup joins is finally over. The UI now shows the relationship used directly on the lookup, even when no explicit relationship is defined. See the screenshot.

Look at the full release notes for a few other changes:

https://support.timextender.com/hc/en-us/articles/360029846311

Download the new version (note the warning if you are already using ODX Server):

https://support.timextender.com/hc/en-us/articles/210439583-Download-Discovery-Hub

Reach out to me if you want to talk about a plan to upgrade to the new version.

-Tobias Eld

The post June 2019 Discovery Hub Release appeared first on OmniData.
