Monday, September 26, 2016

Why no, Tableau, I don't want you reading my mail.

I was excited about being able to use Tableau with my Google Sheets data. The ability to connect to it directly with Tableau 10 was really appealing.

My enthusiasm dissipated quickly when this showed up:

Really, Tableau?
You want to view my email address and basic profile info?

What possible reason could there be for needing this information to establish a connection to my Google Sheets? I can't think of a single one.

Whose idea was this?

Frankly, the whole thing is offensive, and I'm very put out.

Monday, July 11, 2016

On the virtues of simpler and easier.

"Creeping featurism is a disease, fatal if not treated promptly. There are some cures, but, as usual, the best approach is to practice preventative medicine."

— Donald Norman, The Design of Everyday Things, 2002 Basic Books edition, ISBN 0465067107, Ch. 6, p. 173.

"The best software for data analysis is the software you forget you're using. It's such a natural extension of your thinking process that you can use it without thinking about the mechanics."

— Stephen Few on Data Visualization: 8 Core Principles

Tuesday, May 31, 2016

Please stop spamming the comments.

To those of you who submit comments with zero content other than soliciting for your Tableau training, consulting, or other commercial purposes: you might as well stop.

Your comments will not be published.

I find it offensive that you're spamming the comments, contributing nothing to the topic at hand but looking to advertise your wares.

It offends me personally in that I need to spend time attending to your crass, pushy, rudeness.

It offends those of us who have put in the time, energy, and effort to become competent professionals. Your pushing the idea that simply taking a short Tableau training course of dubious quality is enough to get someone to pay you to work with it is at best naive and misleading, and reflects the race to the bottom that values a smear of exposure over real competence.

Examples of comments I will not publish:

Nice Article !!! Thanks for sharing with us !!! Visit - http://[spammer 1].in/

http://www.[spammer 2].com[...]tableau-online-training-in-[...]/

great post i read from start to end its awesome, keep on writing more about TABLEAU. we are one of the leading TABLEAU online trainers.. Visit - http://[spammer 1].in/

your blog is such a wonderful and nice library which filled with lot of technical stuff please share with us http://www.[spammer 3].com/tableau-training.html

Nice Article Thanks for sharing with us !!! Visit - http://[spammer 1].in/

This is a Good blog. Thank you for your very useful information. I appreciate that you looked it up to share with us all!.... Tableau online training on

Tuesday, May 3, 2016

Dashboard Improvement Opportunities - Surface Observations

Tableau Dashboards need improvements.

I've been beating the drum for improvements to Tableau's dashboards ever since they were introduced. As a way to get more than one worksheet into the same visual space they were adequate, and they still work OK/tolerably/nottoobad/betterthannothing, etc. for those situations where you only want to double-click 2, 3, or 4 worksheets and have Tableau put them into a grid-based layout.

But this isn't good enough.

Over the years I've been using Tableau there's been far too much time consumed with fiddling, faking, fooling, and futzing around with dashboards. Time that takes away from the real value of helping people discover, understand, and communicate useful and valuable information in the data that matters to them.

Yes, Tableau has made improvements to Dashboards and dashboard creation and maintenance. And some of these are really welcome, as far as they go. For example, the introduction of the Layout control in the Dashboard Window was a half-quantum leap forward. For the first time we could see, without use of external tools, how the elements in a Dashboard related structurally to one another. But as good as it was, as much of a leap, it was still only a half-step forward: the Layout control is so small, and its selection management abilities so poor, that using it is an exercise in multi-click hell. And it's still easier to get a comprehensive view of Dashboard contents with external tools.

It was at this point of describing my frustrations with a colleague who was relatively new to Tableau, and didn't have any experience in more sophisticated tools, that she said, in effect: "well, I don't see what's wrong with it, so why don't you show me?"

Hence the following diagram. I created a simple, two-sheet dashboard and used it to illustrate some of the problems that jumped out. It's in no way comprehensive—I tried to keep it relatively simple.

The Tableau Public published version of this workbook is here.

A concrete example. Have you ever been wrestling with a dashboard, trying to get things nicely organized and arranged, only to have Tableau seemingly go insane, moving things around that you haven't touched, making it difficult to place and resize things that only a moment ago were all nice and tidy? The following example shows one of the things that can contribute to the chaos.

Multiplying Containers.

Start with a newly created dashboard.

Then click "Show Title" 5 times,
i.e. issue this command sequence:

  1. Show Title
  2. Unshow Title
  3. Show Title
  4. Unshow Title
  5. Show Title

 

After all the Title Showing and Unshowing you should see that Tableau has gone ahead and created this container structure for you.

This is bad. But it gets worse—once a Dashboard is populated with actual (human) content Tableau will insert containers as you, the dashboard author, do your authoring. This can make for extremely messy situations.
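
If you want to see exactly what Tableau has built, the container structure is right there in the Workbook's XML. Here's a minimal sketch, not part of the original example, that uses the Twb and Nokogiri gems covered in later posts to walk a dashboard's layout nodes and print their nesting. The workbook name is hypothetical, and it assumes the layout containers show up as nested zone elements with type and id attributes, which is worth verifying against your own .twb before relying on it.

  # Sketch: print the nesting of a dashboard's layout containers.
  # Assumes containers appear as nested <zone> elements in the .twb XML;
  # the workbook name below is hypothetical.
  require 'twb'

  twb = Twb::Workbook.new('ContainerDemo.twb')
  twb.dashboards.each do |name, dash|
    puts "Dashboard: #{name}"
    dash.node.xpath('.//zone').each do |zone|
      depth = zone.ancestors('zone').length          # how deeply this container is nested
      puts "#{'  ' * depth}zone type=#{zone['type']} id=#{zone['id']}"
    end
  end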

Monday, January 11, 2016

Nuggets and Seeds

Preamble

This post presents a collection of thoughts covering some of the idea space I've developed over nearly ten years of using Tableau, largely framed within the context of its existence as a data analytical tool suitable for use in the modern enterprise.

There's no real intentional organization; although there are commonalities, themes, and overlaps, no narrative is intended. Some of the nuggets and seeds have been fleshed out to some degree in working notes, but not to the point where I'm comfortable publishing them. Part of the purpose of listing them here and now is to experiment: to get them out and see whether doing so helps stimulate a broader synthesis.

One of my motivations is that I've become increasingly concerned with the direction the field I've made my profession is taking. At heart I believe that data analysis should be part of everyone's intellectual toolbox, that being able to explore the data relevant to one's area of interest provides the opportunity to achieve a deeper and richer understanding of the state of things. We use tools to augment our intellect—our knowledge and expectations of our environments. Data analytical tools are, or should be, like other tools: providing useful functional features that maximize their usefulness while minimizing the concessions the person using it must make.


What's the Point? A Brief History of Computer-Assisted Data Analysis

– FORTRAN, COBOL, 4GLs (e.g. FOCUS, RAMIS, Nomad), {the dark ages}, ... Tableau, ...?
– terminals, line printers, GUIs, touch-sensitive surfaces, ?

Data Analysis is Not Just Visualization or,
Visualization Is (Only) The Thin Outer Layer of Analysis

"Visualization" is the currently popular term, used far and wide as the tag for the new wave of tools, technologies, and activities that provide the means to access and present data in a form that people can interpret and derive information from.

But it's misleading, a gross simplification that ignores the primary role context plays in forming and communicating data's information value.

"123.45" is a data visualization

just as much as is this bar, and "123.45" is more effective in communicating the quantity to boot

decimal notation using Hindu-Arabic numerals is, in this context, a time-proven, highly effective method of visually representing numeric quantities to an arbitrary level of precision in a small space

thinking that quantitative data visualization is restricted to geometric forms is a handicap

Decision-Making Benefits From Analysis of Available Data

Data Analysis is (or should be) a Cognitive and Intellectual Skill

supported as much as possible with tools and technologies that augment, not inhibit, or erect unnecessary barriers to human abilities

unfortunately, there are forces that continually work to shape data analysis as a primarily technical undertaking—these forces are to be resisted, but must be understood in order to be overcome

For Whom the Tool Toils

data analysis is for everyone, not just the executives at the top, or the line-of-business people on the surface

The Right Tool for the Job

there's no golden hammer

Why Only Tables?

Tableau can only access and analyze data in tabular form (with the limited exception of cubes, which are infrequent targets for Tableau analysis, perhaps because Tableau's analytical abilities with cubes are so limited).

To the best of my knowledge the decision to restrict analysis to tabular data has never been explained.
But, it's a severe limitation, and one that is difficult to understand.

Granted, at the time Tableau came into existence, and Polaris before it, organizing data in tables had become the de facto norm. There are a lot of historical reasons for this; a discussion of them is beyond the scope of this post. But the truth is that table-based data organization, even when Relational (and by far most real world systems aren't proper Relational models), is a terrible way to store data for human understanding (and I've had some people take real umbrage at this statement).

Historically, before tables became the norm, data was stored in structures that matched the information model the data was persisting information about.
Hierarchical databases were everywhere, and network databases weren't so rare as to be alien.
Even more relevant to this discussion, the 4GL data-analytical tools from the 1970s could understand these structures and analyze them correctly in context, providing the human-correct results.

In the present, the growth in the number of non-tabular data sources that people are interested in exceeds that of standard tabular sources. NoSQL, JSON, XML, YAML, and many other formats and structures have emerged and taken root.

Tableau's inability to recognize and analyze non-tabular data leaves a huge hole in the data analysis tool marketplace. Who's going to fill it?

No One Ever Got Fired For Quoting Gartner

Back in the way back, when IBM dominated the business automation universe, when mainframes ruled the roost the conventional wisdom held that "No one ever got fired for buying IBM".

There's a similar knee-jerk reflex conditioned into today's executive managers responsible for selecting strategic information technology for their organizations—they rely upon Gartner, particularly its Magic Quadrant, and similar research firms to tell them which companies are positioned where in the marketplace. Many, too many, managers take these fora as trustworthy guides for their purchasing decisions, essentially abdicating some level of responsibility for conducting their own research into the technological space wherein may lie tools, products, etc. that could be useful and valuable.
(the validity and shortcomings of these market analyses are well documented and argued elsewhere)

Getting recognition in these fora is a huge leap up the food chain for vendors. Being identified as a viable product in the Magic Quadrant is a threshold event that exposes the vendor and product to the widest, deepest-pocketed audience/market in existence. Being recommended is a quantum leap forward. Which is all fine and good if one's trying to build one's company into the largest, most lucrative entity possible.

But.

Is that what a truly innovative company interested in providing the best possible product that helps the greatest number of people should be striving for: simply to grow, and grow, and grow?

Just analyze it.

Many organizations that are trying to adopt Tableau as a BI tool or technology are going about it the wrong way. They're using Tableau in the traditional SDLC model, and are as a result missing the greatest part of the value Tableau offers.

Software isn't material, so there's little cost to trying something to see if it works. Experiments can occur almost at the speed of thought, dramatically closing the gap between conception and creation.

The traditional approach of concept, analysis, design, build, deliver is mired in a model of work that's rooted in the industrial production paradigm that underlies modern business management theory and practice (I have a BBA and MBA). This model worked extremely well during the industrial age, when producing large numbers of physical goods was the organization's work. It's not only of little value in the non-physical world, it is actively detrimental to the good conduct of work that's predominately centered around fluid cognition and creation.

Tableau's Arc

Tableau has occupied a shifting position in the data analysis pantheon over the past decade. This is the story of my experience with it, the ways in which it's been employed, and its presence in the broader worlds in which it's taken root.

Thinking About What Makes Good Data Analytical Tools

Looking to Bret Victor's Learnable Programming, two thoughts about learning:
  • Programming is a way of thinking, not a rote skill. Learning about "for" loops is not learning to program, any more than learning about pencils is learning to draw.
  • People understand what they can see. If a programmer cannot see what a program is doing, she can't understand it.
Thus, the goals of a programming system should be:
  • to support and encourage powerful ways of thinking
  • to enable programmers to see and understand the execution of their programs

There's a lot to learn from these thoughts, directly relatable to using technology for data analysis. For example:

Data analysis is a way of thinking about the nature of things: their identities, quantities, measurements/metrics, and relationships to other things.

Learning about the technical properties of specific technical implementations of analytical operations is not learning about data analysis.

If people cannot see into the machine, it's very difficult to achieve a robust understanding of what the machine is doing, how it does it, how to imagine the things it can do, and how to set it up so that it does what one wants.

Design Rot: Tableau's Usability Has Stalled

For years, Apple followed user-centered design principles. Then something went wrong.
In their article How Apple Is Giving Design A Bad Name, Don Norman and Bruce Tognazzini argue that Apple has abandoned the principles of usability in its product designs in favor of a beautiful aesthetic experience that hinders rather than helps users' ability to accomplish the things they want to.

In its own way, Tableau has similarly failed to continue to pursue the same elegant usability at its heart.

When Tableau appeared its design was revolutionary, providing simple, intuitive objects, and interactions with those objects, that surfaced data and fundamental data-analytical operations, very closely matching the human view of a particular and useful analytical model.

This was Tableau's genius. For the first time people could -do- simple, basic data analysis with a tool that made it simple and easy.

Since then, Tableau hasn't followed through with its initial promise. The simple and easy things are still simple and easy, but pretty much everything else beyond this space is too complex, complicated, confusing, obscure, and unnecessarily hard to figure out.

The list of Tableau's design faux pas has become too large to catalog. At one time I had hopes that Tableau would recognize it had accumulated too much bad design and take steps to remedy the situation. I no longer believe this; Tableau appears to have invested so much into its current way of doing things, and reaped so many rewards for doing things the way it has, that there's no motivation or incentive for it to change its stripes.

Side Effects, Tips, Tricks, and Techniques: Useful Aids or (Un)Necessary Evils?

Does the need to learn, master, and employ these to accomplish useful analytical effects add or detract from Tableau's overall utility and value?

It's not a simple black or white situation, but there's an inverse relationship between the quantity and complexity of the technical things one needs to learn to accomplish useful things and overall utility. The more tricky things one needs to know, the harder the tool is to use from a human perspective, and the further from the primary goal one needs to work.

On the Consequences of the Exaltation of Complexity

what happens when mastery of arcane technical matters is elevated and praised above sense-making

Tableau's Salad Days

are the best behind us?

The Perils of Ossification

what happens when a tool freezes, welding into place aspects that could be improved through continual evolution

Suffering the Innovator's Dilemma
or,
The Rise and Fall of a Disruptive Innovator

Beware the Cuckoo's Egg

considering the consequences when one tool pushes out others

Yes, You can do that in Tableau. So what?

just because it can be done, should it be?

Tableau is a terrific tool for accomplishing basic data analysis quickly and easily, and for communicating interesting findings, also quickly and easily.

There are, however, limitations in what Tableau does simply and easily, and more limitations in what Tableau can be coerced into doing. When faced with situations where needs fall outside Tableau's capabilities, or where the effort to satisfy the needs with Tableau exceeds the effort to satisfy them with another tool or technology, it's a good idea to at least entertain the notion that Tableau should not be used.

Velocity, Value, Volume

Pervasive Data Analysis - a Promise As Yet Unfulfilled

it's been over thirty years since the idea of making analysis of one's own data possible and as seamless and easy as possible surfaced
there was a flourishing of the concept for a while; business people could analyze their data with minimal involvement and support from their data processing organizations, but it didn't last

there was a decline
along with a dramatic increase in the size, wealth, and power of the database and BI platform vendors

a decade ago Tableau appeared, and made it possible for nontechnical people to access and conduct their own fundamental data analysis, achieving previously unthinkable insights immediately with little or no technical intervention or support—it was a revelation, and carried the hope that pervasive data analysis could become a reality

so... why haven't things progressed all that much in the ten years since?

Self-Service BI – it ain't what you think

Anecdote: Several years ago I was talking to a friend, a Senior Information Officer at The World Bank, about Tableau's benefits, how it had the potential to change everything, describing how it could, if adopted effectively, be the path to helping 'ordinary' people obtain the insights and information from the data that mattered to them, much of which lay outside the boundaries of the Bank's institutional data hoard.

Her response was that they didn't need it, their needs were being satisfied through the self-service Business Objects environment they'd set up.
As it turned out, the BO solution didn't gain much traction.

Deliver Value Early and Often

Do the Right First Thing First
or,
Start With Data Analysis

it seems obvious, almost not worth mentioning
but far too many BDA efforts ignore, or are ignorant of, the opportunity to start with data analysis, often in the mistaken and disastrous belief that data analysis is something that happens after precursor activities take place

You Can't Start With Everything
if you try to, you'll never have anything
or,
The Big Bang Delivery Model is a Recipe for Failure

Diminishing IV/E

the effort to achieve information value from data increases more than linearly,
or,
it gets harder and harder to obtain the next level of value from data, along multiple dimensions

Does Deep Tableau Expertise Lead to Diminished Value?

do the demands and difficulties involved in developing the skills necessary to wield Tableau successfully across a broad spectrum of analytical purposes and outcomes detract from the value that could be delivered with it?

Beware Re-branded Big BI

several years ago it wasn't uncommon for Tableau advocates to contrast Tableau to Big BI

Tableau was seen as the 'anti-BI', the human-oriented tool that would be an antidote to Big BI's ills

then Tableau gained traction, became better-known, then popular; it surfaced into the corporate executive mindspace through reviews including Gartner, Forrester, etc., and to some extent from the bottom up as people discovered its benefits and used it to good effect

once this happened things began to change

traditional vendors started to come out with data visualization components and tools as their "New!", "Improved!" offerings, trying to capitalize on the market Tableau had pioneered

the Data Warehousing folks, the very same ones who had, for almost two full decades, been preaching to the faithful of the need to devote themselves to building enterprise-spanning industrial-strength universal conformed data repositories and associated answer-all-questions analytical platforms, crashed the party declaring their allegiance with the new agile, adaptable BI world
but they were still selling their same old wares with a fresh coat of paint slapped on

Don't Struggle Alone with Your Data Analysis, Struggle with Tableau

Tableau is Bleeding

Contemplating Complexity's Consequences

some problems are inherently complex, but the means of addressing them should be no more complicated than necessary

Baroque is Broken

ornamentation and elaborate constructions can be superficially attractive but they are often at odds with, even detrimental to, real usefulness

Tableau's Data Blending: is it really a Good Thing?

legend has it that data blending was a hack by one of Tableau's developers; true or not it has the feel of one
hack or not, it's a very useful mechanism, and has been leveraged by very clever people to achieve all sorts of very useful analytical outcomes
but... it has limitations that restrict its usefulness in many real world situations—the question here is to what degree they render it irrelevant for real world purposes

Misalignment of Focus

Analytical Types

– Curiosity vs Confirmation
– Explainers vs Confirmers
– Explorers vs Justifiers

Zombie BI

we thought Big BI was dead, or at least on life support, but it's showing signs of resurrecting

Enterprise Data Analysis is Fractal

self-similar at all scales

All Data is Valid or,
There's no bad data,

but much of it is misunderstood and/or it tells unpalatable truths.

Scaling Mount Simplicity

keeping things simple isn't easy, but it's worth striving for

Whither Data Analysis Tools?

Rethinking the Data Warehouse

storing, safeguarding, and provisioning an enterprise's data isn't what it used to be
(and the traditional model didn't work all that well anyway)

Your Tableau Are Doomed

Pursuit of Maximizing Market Potential Considered Harmful
or,
Maximizing Market Potential, a Cautionary Tale

The New Hope Fades

Entropy in the Tableau Universe

Clarity, Coherence, Completeness are Virtues

determining whether data analytical efforts are worth pursuing

Development is Best Served in Minimal Portions


Development is the technical implementation of someone else's ideas

using a traditional SDLC as the default approach to providing actionable information to business decision makers is a very bad idea (unless you're a software vendor or otherwise benefit from the expenditure of too much time, money, attention, energy, and other resources)

Requiem for a Once Noble Tool

Once upon a time a very clever young man thought up a new and improved way for nontechnical people to analyze their data. This required a re-conception of "data analysis", moving away from the prevailing paradigm of technologically-centered programmatic creation of artifacts that, hopefully, conveyed useful information. The young man's innovation took a different approach. It provided mechanisms that tightly coupled the basic data analysis operations (field selection, data filtering, sorting, and aggregation) with intuitive, system-presented data and analytical operation avatars familiar to 'real' (i.e. nontechnical) people. These mechanisms, when combined within the operational environment by these people, caused the system to generate and present the appropriate analytic.

This was a revolution, and it changed everything. For the first time people could interact with their data directly, without enlisting the assistance of other people with specialized technical skills, and do their own data analysis. For the first time there was the very real hope that things would continue to get better, that with the barrier to data analysis now breached the breadth, depth, and reach of human-centered data analysis tools would continue to blossom.

There were limitations, of course, as there always are in the first conception of a solution to a simple subset of a very large, intrinsically complex problem space. Data analysis is almost unbounded in its full range, from the types of things that can reasonably be thought of as data, to the analyses that can be conducted. One rough analogy is to consider the initial tool as providing arithmetic functionality, and the full potential space as the full range of mathematical analysis, e.g. number theory, calculus, etc.

Sadly, the goodness failed to fulfill its promise.
It stalled out early.
Instead of continuing to expand the realm of easy, nontechnical, human-centered analysis into the analytical universe it expanded its functionality by adding technical, deeply mysterious features that left mere 'real' people on the outside looking in.

And in due time it became, first, something some people had some familiarity with, then one that some had heard of, then a ghost of a past that didn't matter to most people.

Wednesday, December 2, 2015

The Fallacy of The Canonical Dashboard(s)

I've once again come across an article promulgating the conventional wisdom that runs along the lines of: "Important information about the Dashboard (or two, or three) your business needs."
It's here: Why every business needs two dashboards for clear flying, and it contains this passage:

The two dashboards every business needs

"But it actually isn’t enough to have just one dashboard; I believe every business needs two dashboards: strategic and operational. Like the cockpit instruments in a fighter jet, they allow the executive to know exactly where he or she is at any given time and focus on getting to the destination in one piece."

Putting aside the unfortunate, and by now antiquated, fighter jet cockpit metaphor, the article recognizes that one dashboard isn't enough. But it continues to promote the idea that there is a small set (in this case two) of dashboards that, if carefully considered, can provide the information decision makers need to run their business.

This is an anachronistic view of the world of business data analysis that doesn't recognize developments of the past decade that have moved beyond its limitations.

In the real world, any small set of canonical dashboards is limited in the information it can convey, and doesn't extend more than a step or two towards the horizon of useful information.

The idea that there's a limited view of one's information space that's adequate for monitoring and decision-making is rooted in historical factors. Briefly: because creating information delivery artifacts, e.g. dashboards, took very substantial amounts of time, energy, money, and other resources, people became conditioned to the idea that there was a limited view that, once identified, designed, built, and delivered, would be adequate for their information needs. This was always an artificial limitation, an unfortunate (and in reality unnecessary) consequence of and concession to the deficiencies of the business data management and analysis environment.

The past decade has seen the emergence of better, faster, low-friction tools, technologies, and practices that dramatically narrow the gaps between data and the people who need to understand it.

The past five years have seen increasing awareness of these tools, particularly with Tableau's recognition by Gartner, Forrester, TDWI, and related media and general audience channels.

The implications of the new opportunities have, as in all paradigm shifts, been slower to bubble to the surface, but they're starting to become part of the discourse, even as the traditional message that there's a canonical set of dashboards that's sufficient for running a business persists.

The modern reality is that it's possible to discover and deliver data-based information on an ongoing basis, including but not limited to a small set of pre-identified KPIs in one or two dashboards. There's a very small distance between dynamic data discovery and the composition of relevant analyses into dashboards—this is a fundamental departure from the traditional BI world, and marks a qualitative shift in how effective business data analysis can be pursued. It's now possible to provide the information people need to make decisions from the relevant data as they need it, even if it's not previously been formalized in pre-constructed forms: dashboards, scorecards, etc.

Organizations that recognize that they're no longer constrained by the traditional limitations can take advantage of the new opportunities and dramatically improve their data-based decision making abilities. One of the first steps is recognizing that they can access, analyze, and understand their data as needed, rather than speculating about future information needs and spending time, energy, and effort tackling technical implementation efforts for potential payoff. As they absorb this concept, people recognize that they no longer need be shackled to one, two, or some small number of discrete dashboards.

Tuesday, November 24, 2015

Hack Academy - Multiple Moving Averages

Hack Academy – explaining how Tableau works in real world examples.

Note: this post is a work in progress.

This session delves into the workings behind the solution to this Tableau Community request for assistance: Showing maximum and minimum with a calculated moving average. Please refer to the Tableau Community posting for the full details.

In the post the person asking for help explained that she was looking to have a chart like this:
and described her goal thus:

"So I want to aggregate the previous 4 years worth of data (and not show as individual years), average it for each week and then display it as a 3 week rolling average (which I've done) and also calculate and display the maximum rolling average and the minimum one too. That way it can be easily seen if the rolling average for this current year falls within the expected range as calculated from the previous 4 years worth of data."

The Solution

One of Tableau's Technical Support Specialists (community page here) provided a workbook containing a solution; it's attached to the original post. She also provided a step-by-step recipe for building the solution worksheet. Cosmetic adjustments have been made to the solution to make it easier to identify and track the Tableau elements.

The blue lines identify the parts and something of the relationships between them. The complexity of the parts and their relationships is difficult for inexperienced people to wrap their heads around.

This post expands upon the solution by looking behind the curtain, showing the Tableau mechanisms employed—what they do and how they work. It does this by providing a Tableau workbook, an annotated set of diagrams showing how the worksheets' parts relate to one another, and explanatory information.

The Solution Recipe

The worksheet's caption lays out the steps for generating the desired visualization. This is helpful in getting to the solution, but doesn't surface the Tableau mechanisms involved.

The rest of this post lays out the Tableau mechanisms, how they work, and by extension how they can be understood and assimilated so they can become tools in one's Tableau toolbox, available for use when the needs and opportunities arise.

The Solution Recipe, annotated

In this diagram the arrows indicate the instructions' targets and the effects of the recipe's steps.
– The green arrows indicate record-level calculated fields
– The red arrows indicate Tableau Calculation fields
– The thin blue lines show how fields are moved

The first thing that jumps out is just how fiendishly complicated this is. Even though less than half of the instructions have been annotated, the number and complexity of the relationships is almost overwhelming. In order to achieve analytical results like this, the analyst must first understand this complexity well enough to generate from it the specific desired effects. One of Tableau's deficiencies is that mastering and managing all of this complexity is left up to the analyst, i.e. Tableau provides virtually nothing that surfaces the parts and their relationships in a way that allows for easy comprehension and manipulation.

Once the instructions reach the "2015 Sales" step the diagram doesn't show the Measures in the location the recipe indicates. Instead of being on the Rows shelf (where the recipe puts !Sales = 2015) they're on the Measure Values card. This is because once there are multiple Measures in play, organized in this manner, they're configured via the Measure Values card and the Measure Names and Measure Values pills. This is one of the things that makes it difficult for people new to Tableau to puzzle out what the parts are and how they work and interact.

Implementing the Solution

The following Tableau Public workbook is an implementation of the Recipe.

The Inner Workings

Although the Public Workbook above implements the solution and is annotated with descriptive information, it doesn't go very deep in surfacing and explaining the Tableau mechanisms being taken advantage of—how they work and deliver the results we're looking to achieve. This section lifts Tableau's skirts, revealing the behind-the-scenes goings-on.

Tableau's Data Processing Pipeline
One of the things that can be difficult to wrap one's head around is Tableau's mechanisms for accessing and processing data, from the underlying data source through to the final presentation. Tableau processes data in (largely) sequential stages, each stage operating upon its predecessor's output. This solution employs multiple stages; this section lays out their basics, illustrating how they're employed to good effect.

Tableau applies different data processes and operations at different stages—in general, corresponding to the different 'kinds' of things that are present in the UI. These stages are largely invisible to the casual user, and their presence can be difficult to detect, but understanding them is critical to being able to understand how Tableau works well enough to generate solutions to novel situations.

The main mechanism: selective non-Null Measures presentation
At the core of the solution is the distinction between data structure and presentation. In this situation there are, in effect, two data layers in play; we represent the stages as layers in order to help visualize them.

The basic ideas are: when displaying data Tableau will only present Marks for non-Null values; and Table Calculations can be used to selectively instantiate values in different layers. The underlying layer is where the data is stored upon retrieval from the database. The surface layer is where Tableau presents selected data from the underlying layer to the user. The key to this solution is that Tableau only presents some of the underlying layer's data—that required to show the user what s/he's asking to see.

Demonstration Tableau Public workbook.

This Workbook contains a series of Worksheets that demonstrate these Tableau mechanisms.

The Worksheets are shown below, along with descriptions of what's going on.

Download the Workbook to follow along.

Setting up the data.
The fields used to illustrate the data processing:

!Sales = 2015
 IF DATEPART('year',[Order Date])=2015
 THEN [Sales]
 END

!Sales = 2015 (Null)
 IF DATEPART('year',[Order Date])=2015
 THEN [Sales]
 ELSE NULL
 END

!Sales = 2015 (0)
 IF DATEPART('year',[Order Date])=2015
 THEN [Sales]
 ELSE 0
 END

The major difference between the fields is whether they evaluate to Null or 0 (zero) when Order Date's Year is not 2015. The first two fields evaluate to Null—the first implicitly, the second explicitly. The third evaluates to 0.

This is the distinction upon which the solution's deep functionality depends. Recall that Tableau only presents non-Null data; this solution takes advantage of this by selectively constructing the Null and non-Null presentation Measures we need.
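
This Null-versus-zero behavior is easy to lose sight of, so here's a toy illustration, plain Ruby rather than anything Tableau actually does internally, of why the distinction matters: only non-nil results get a Mark, so a 0-producing calculation creates a Mark for every Year while a Null-producing one creates Marks only for 2015. The rows are made-up sample data.

  # Toy model of 'Marks only for non-Null values'; the rows are made-up sample data.
  rows = [
    { year: 2012, week: 1, sales: 100 },
    { year: 2013, week: 1, sales: 150 },
    { year: 2015, week: 1, sales: 175 },
  ]

  sales_2015_null = lambda { |r| r[:sales] if r[:year] == 2015 }     # Null (nil) for other years
  sales_2015_zero = lambda { |r| r[:year] == 2015 ? r[:sales] : 0 }  # 0 for other years

  { 'Null variant' => sales_2015_null, 'Zero variant' => sales_2015_zero }.each do |label, calc|
    marks = rows.map { |r| [r[:year], calc.call(r)] }.reject { |_, v| v.nil? }
    puts "#{label}: #{marks.inspect}"
  end
  # Null variant: [[2015, 175]]                         -- one Mark
  # Zero variant: [[2012, 0], [2013, 0], [2015, 175]]   -- a Mark for every Year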

The data structure.

This viz shows the basic data structure needed to support our goal of comparing Weekly Moving Averages for each of the Order Date Years:

  • there are columns for each Week (filtered here to #s 1-5); and
  • each week has 'slots' for each of the four Order Date Years.
Tableau shows Marks for each combination of Order Date Year and Week for which there's data; in this case the Marks are squares. This is one of Tableau's magic abilities that really adds tremendous value in assisting the analytical process (and in many cases is itself a very valuable diagnostic tool).

Showing the Years.

Right-clicking Year (Order Date) in the Marks card and selecting "Label" tells Tableau to show the Year for each Mark.

This confirms the data structure, and is one of the basic steps in building complex visualizations.

The Yearly Sales.

In this viz Sales has been added to the Marks card—Tableau applies its default SUM aggregation and configured to be used as the Marks' labels.

As shown, Tableau uses the Sales sum for each Year and Week as the label.
This can be confirmed to show the accurate values, if desired, via alternate analyses.

Note that the viz shows the actual Year & Week Sales totals, not the Sales compared to the same Week in 2015.

Measures on the Marks card.

In this viz Sales has been replaced on the Marks card by the three Measures shown.

Our objective is to see how Tableau presents each of them vis-a-vis the base data structure.

Presenting the 2015 Sales.

SUM(!Sales = 2015) has been used as the Marks' label. As we can clearly see, there's only one Mark presented for each Week. One may wonder: why is only one Mark shown for each week when we know from above that there are four Years with Sales data for each?

In this case, Tableau is only presenting the Marks for the non-Null measures in each Year/Week cell, because the !Sales = 2015 calculation
  IF DATEPART('year',[Order Date])=2015
  THEN [Sales]
  END
results in Null values for each Year other than 2015, so there's nothing for Tableau to present.

One potential source of confusion is that the "Null if Year <> 2015" result for the !Sales = 2015 calculation is implicit, i.e. Tableau provides the Null result by default in the absence of a positive assignment of a value when the Year is not 2015.

Presenting the 2015 or Null Sales.

This viz has the same outcome as the one above.

The difference is that the calculation for
  IF DATEPART('year',[Order Date])=2015
  THEN [Sales]
  ELSE NULL
  END
explicitly assigns NULL to the non-2015 Years' values.

Using an explicit NULL assignment is advised as it minimizes the cognitive burden on whomever needs to interpret the calculation in the future.

Presenting the 2015 or 0 Sales.
Recreating the viz – an alternate method.

From this point we're going to be building the viz from the bottom up, showing how the constituent parts operate and interact with each other.

Sales – Total, 2015, & pre-2015

First up: making sure that the Sales calculations for pre-2015 and 2015 are correct.
The calculations are correct—they sum up to the total of all Sales.

Configure 2015 Sales to be the 3-week Moving Average

The moving average for 2015 Sales is generated by configuring the SUM(!Sales = 2015) measure as a Quick Table Calculation in the viz.
The Steps:
  1. Activate the SUM(!Sales = 2015) field's menu
  2. Select "Quick Table Calculation", then choose "Moving Average"
    Tableau will set up the standard Moving Average Table Calculation, which uses the two previous and current values as the basis for averaging.
    Since this isn't what we're after, we need to edit the TC.
  3. Select "Edit Table Calculation" (after activating the field menu again)
    Configure as shown, so that Tableau will average the Previous, current, and Next values.
    Note: The meaning of "Previous Values", "current", and "Next Values" is inferred from the "Moving along: Table (Across)" setting.
  4. The "Compute using" field option shows the same "Table (Across)" value as the "Moving along" option in the Edit Table Calculation dialog.
  5. Add !2015 Sales back to the viz.
    There are a number of ways to accomplish this—most common are dragging it from the data window, and using the Measure Names quick filter.
    Why do this?
    Configuring the Table Calculation in steps 1-4 changed the SUM(!Sales = 2015) field in the Measure Values shelf from a normal field to a Table Calculation field (indicated by the triangle in the field's pill). Adding SUM(!Sales = 2015) back to Measure Values provides the opportunity to use its values in illustrating how the Moving Averages are calculated.
How it works:
For each Moving Average value, Tableau identifies the individual SUM(!Sales = 2015) values to be used then averages them.
The blue rectangles in the table show individual Moving Average values, pointing to the referenced "SUM(!Sales = 2015)" values.
There are three different scenarios presented:
  • Week 1 — there is no Previous value, so only the current and Next values are averaged.
  • Week 4 — averages the Week 3 (Previous), Week 4 (Current), and Week 5 (Next) values.
  • Week 7 — there is no Next value, so only the Previous and current values are averaged.
Note:
There's no need to include SUM(!Sales = 2015) in the visualization to have the Moving Average Table Calculation work. I've added it only to make explicit how Tableau structures, accesses, and interprets the data it needs for the presentation it's being asked to deliver.
Pre-2015 Sales 3-week Moving Average - the default configuration
Please note: this is implemented using a persistent Calculated field coded as a Table Calculation: !Sales < 2015 Moving Avg
This is a different approach than using the in-viz configuration of the 2015 Sales shown above. There are differences in the two approaches, some obvious, some subtle.
The Steps:
  • Add !Sales < 2015 Moving Avg to the Measure Values shelf as shown.
    As mentioned above, it can be dragged in from the Data Window, or selected in the Measure Names quick filter.
How it works:
When Tableau puts !Sales < 2015 Moving Avg in the viz it applies the default configuration as shown. In this viz the use of Table (Across), as shown in both the Table Calculation dialog and the field's 'Compute using' submenu, provides the desired functionality, i.e. averaging the appropriate !Sales < 2015 values, based upon the field's formula:
  WINDOW_AVG(SUM([!Sales < 2015]),-1,1),
resulting in:
  • Week 1 — only the current and Next values are averaged.
  • Week 4 — averages the Week 3 (Previous), Week 4 (Current), and Week 5 (Next) values.
  • Week 7 — only the Previous and current values are averaged.
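
The shrinking-window behavior at the edges is the same in both approaches. Here's a small sketch that approximates what a WINDOW_AVG computed with offsets -1 and 1 along Table (Across) does; the weekly numbers are made up for illustration rather than taken from the workbook.

  # Approximation of a 3-value moving average with offsets -1..1:
  # average the previous, current, and next values, shrinking the window
  # at the first and last positions. The weekly numbers are illustrative only.
  weekly = [10, 20, 30, 40, 50, 60, 70]   # weeks 1..7

  moving_avg = weekly.each_index.map do |i|
    window = weekly[[i - 1, 0].max..(i + 1)]            # clamp below; Ruby trims past the end
    window.inject(0.0) { |sum, v| sum + v } / window.size
  end

  moving_avg.each_with_index { |avg, i| puts "Week #{i + 1}: #{avg.round(2)}" }
  # Week 1 averages weeks 1-2, Week 7 averages weeks 6-7, all others average three weeks.
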
Add Order Date Year to Rows
Adding the Order Date Year to Rows instructs Tableau to construct a set of the Measures for each individual Year in the Order Date data.

Note that the Measures are only instantiated for those Years for which they are relevant, i.e. the pre-2015 Measures only have values for the years prior to 2015, and the 2015 Moving Average only has values for 2015.

Having these Year-specific values sets the stage for the next part: identifying the Minimum, Average, and Maximum of the pre-2015 Yearly Moving Averages.

For example, as shown in the viz, these values and Min/Max for Week 1 occur thus:
2012 – 36,902 - Max
2013 – 30,669 - Min
2014 – 34,707
and the Average of the Yearly Moving Averages is: 102,278 / 3 = 34,093

How Tableau accomplishes constructing the Measures for this viz is beyond the scope of this post, and it can get complicated.

Add the pre-2015 Sales Moving Average Minimum
Part 1 - add the Field
Drag !Sales < 2015 Moving Avg - Min from the Data window to the Measure Values shelf as shown.

Tableau generates a value for each Week for each Year—for the Years prior to 2015.
   This image has been cropped to show only 2012 & 2013.

As shown in this image, for each Year, all of the !Sales < 2015 Moving Avg - Min values are the minimum of that Year's Weekly !Sales < 2015 Moving Avg values. This is because Tableau's default configuration for a Table Calculation Measure added to a viz is Table (Across).

In order to achieve the desired calculation - that each Week's value for !Sales < 2015 Moving Avg - Min reflect the minimum of the values for that Week for the individual Years, we need to configure !Sales < 2015 Moving Avg - Min in the viz, directing Tableau to perform the calculation in the desired manner.

Part 2 - configure the Field
The Steps:
  • 1..2 – operate as shown
  • 3..4 – select the "Compute using: | Advanced" option
    Note the active/default Table (Across) option; as explained above, this is why the default calculation finds the minimum value among the Weeks for each Year.
  • 5 – move "Year of Order Date" from "Partitioning:" to "Addressing:"
    Partitioning and Addressing are fundamental aspects of how Tableau evaluates and calculates Table Calculations. Covering them is beyond the scope of this post.
    Googling "Tableau partitioning and addressing" will lead to a robust set of references.
  • 6..7 – "OK" & "OK" to apply the configuration.
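
Partitioning and addressing deserve their own post, but the effect of step 5 can be sketched with a rough analogy, not Tableau's actual mechanics: with Year moved into the addressing, the minimum is taken across the Years within each Week, rather than across the Weeks within each Year. The numbers are the Week 1 values quoted earlier.

  # Rough analogy for step 5: partition by Week, address along Year,
  # so each Week gets the minimum of that Week's per-Year moving averages.
  # Values are the Week 1 figures quoted above.
  moving_avgs = [
    { year: 2012, week: 1, avg: 36_902 },
    { year: 2013, week: 1, avg: 30_669 },
    { year: 2014, week: 1, avg: 34_707 },
  ]

  min_per_week = moving_avgs.group_by { |r| r[:week] }.map do |week, rows|
    [week, rows.map { |r| r[:avg] }.min]
  end

  puts min_per_week.inspect   # => [[1, 30669]] -- matching the 2013 minimum quoted above
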
...

Monday, June 15, 2015

Make Dashboards' formatting consistent.

Has this ever happened to you?

In the course of normal events dashboards get built with a hodgepodge of styles.

For example, in the Workbook to the right the

  • default dashboard has not had any formatting applied, while the
  • formatted dashboard has been formatted.

It's sometimes desirable to have a consistent Dashboard look and feel without going through the tremendously tedious manual process of configuring them individually.

Tableau lacks the ability to enable this, either by setting the defaults you want, or by applying formatting to Dashboards in bulk.

But you can now do it, simply and easily.
There's a Tableau Tool for applying the formatting from a template dashboard to all of the dashboards in a set of workbooks.

Here's a dashboard with the formatting to apply to a selected set of Workbooks.

We'll see below how to make this happen–it's a pretty simple matter of running the appropriate Tableau Tool in the directory where the Workbooks are.

The tool will take the Template formatting and apply it to all the dashboards it's pointed at.

For the tool to work there are two important aspects to this Workbook:

  • The Workbook is named
    Template.twb
  • The Dashboard is also named
    Template.

Of course, if one wants to use another Workbook and Dashboard name, it's easy to reconfigure the tool to accommodate them.
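
For the record, those names live in two globals near the top of the script shown below ($templateTwb and $templateDash); the template names also appear as literals inside the loadTemplate method, so those would need the same change. The alternate names here are purely hypothetical.

  # Hypothetical alternate template names; also update the literal
  # 'Template.twb' / 'Template' strings inside loadTemplate (below) to match.
  $templateTwb  = 'HouseStyle.twb'
  $templateDash = 'HouseStyle'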

The formatted Dashboards.

Here are the default and formatted dashboards re-formatted with the Template formatting.

Important points about the formatting:

  • Only the formatting shown above will be applied;
    there are other things one might wish to configure, but there are complications that go along with them.
  • All of the Template Dashboard's formatting will be applied, even if some part of it hasn't been configured;
    it's possible to implement finer-grained control of what formatting gets applied, but that gets complicated, beyond the scope of this initial formatting approach.

The tool: SetDashboardFormatAll.rb
is available on GitHub here, or can be copy/pasted from below.


  #  Copyright (C) 2014, 2015  Chris Gerrard
  #
  #  This program is free software: you can redistribute it and/or modify
  #  it under the terms of the GNU General Public License as published by
  #  the Free Software Foundation, either version 3 of the License, or
  #  (at your option) any later version.
  #
  #  This program is distributed in the hope that it will be useful,
  #  but WITHOUT ANY WARRANTY; without even the implied warranty of
  #  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
  #  GNU General Public License for more details.
  #
  #  You should have received a copy of the GNU General Public License
  #  along with this program.  If not, see <http://www.gnu.org/licenses/>.
  
  require 'twb'
  require 'nokogiri'
  require 'csv'
  
  $templateTwb  = 'Template.twb'
  $templateDash = 'Template'
  $twbAppend    = '_styled_'
  
  puts "\n\n"
  puts " Setting Workbook dashboard formatting, using the formatting"
  puts " from the #{$templateDash} dashboard"
  puts "   in the #{$templateTwb} workbook\n\n"
  
  $csv = CSV.open("TT-FormattedWorkbooks.csv", "w")
  $csv << ["Workbook","Dashboard"]
  
  def loadTemplate
    return 'Template.twb not found' unless File.file?('Template.twb')
    twb  = Twb::Workbook.new('Template.twb')
    dash = twb.dashboards['Template']
    return 'Template dashboard not found' if dash.nil?
    style = dash.node.at_xpath('./style')
    return '  ERROR - no style available from Template dashboard.' if style.nil?
    puts "   Dashboard styling:"
    styleRules = style.xpath('./style-rule')
    if styleRules.empty?
      puts "\n\t  Template dashboard formatting is default style."
    else
      styleRules.each do |rule|
        puts "\n\t Element: #{rule['element']}"
        formats = rule.xpath('./format')
        formats.each do |f|
          puts sprintf("\t -- %-16s : %s \n", f['attr'], f['value'])
        end
      end
    end
    puts "\n"
    return style
  end
  
  def processTwbs
    path = if ARGV.empty? then '*.twb' else ARGV[0] end
    puts " Looking for TWBs using: #{ARGV[0]} \n\n"
    Dir.glob(path) do |fname|
      setTwbStyle(fname) unless fname.eql?($templateTwb) || !fname.end_with?('.twb')
    end
  end
  
  def setTwbStyle fname
    return if fname.eql?($templateTwb) || fname.include?($twbAppend + '.twb')
    twb = Twb::Workbook.new(fname)
    dashes = twb.dashboards.values
    puts sprintf("\t%3d in: '%s' ", dashes.length, fname)
    return if dashes.empty?
    dashes.each do |dash|
      node  = dash.node
      style = node.at_xpath('./style')
      tmpStyle = $templateStyle.clone
      style.replace(tmpStyle)
      $csv << [fname, dash.name]
    end
    twb.writeAppend($twbAppend)
  end
  
  $templateStyle = loadTemplate
  if $templateStyle.class == 'String'.class
    puts "\t #{$templateStyle}\n\n"
  else
    processTwbs
  end
  
  $csv.close unless $csv.nil?

The Tool in action
Here's the code being run.

In this case the execution command specifies that only the single Workbook ExampleDashboards.twb will have its Dashboard(s) formatted.

Upon startup, the tool looks for the Template Workbook and Dashboard and, assuming it finds them, prints out the formatting found there.

It then looks for Workbooks, either all of them, or those matching the single command line parameter. Those that it finds are listed with the number of their Dashboards, if any, and have their Dashboards formatted to the Template configuration.

By default, the Template-formatted dashboards are written to a copy of the original, with '._styled_' appended to the name. This is a precaution, ensuring that the original Workbook isn't harmed in the process. Adjusting the Tool to apply the formatting directly to the Workbook is a small change, easily made.


 ...$ ls -1 ExampleDashboards*.twb
 ExampleDashboards.twb

 ...$ ruby "{path to}\SetDashboardFormatAll.rb" ExampleDashboards.twb  


  Setting Workbook dashboard formatting, using the formatting
  from the Template dashboard
    in the Template.twb workbook

    Dashboard styling:

          Element: table
          -- background-color : #fae7c8

          Element: dash-title
          -- font-weight      : normal
          -- color            : #000000
          -- font-size        : 14
          -- background-color : #f0d9b6
          -- border-color     : #b40f1e
          -- border-style     : solid

          Element: dash-subtitle
          -- font-size        : 11
          -- font-weight      : normal
          -- color            : #b40f1e
          -- background-color : #d7d7d7

          Element: dash-text
          -- text-align       : center
          -- color            : #b40f1e
          -- background-color : #e1e8fa
          -- border-color     : #1b1b1b
          -- border-style     : solid
          -- font-family      : Comic Sans MS

  Looking for TWBs using: ExampleDashboards.twb

           2 in: 'ExampleDashboards.twb'

 ...$ ls -1 ExampleDashboards*.twb
 ExampleDashboards._styled_.twb
 ExampleDashboards.twb

 ...$

Friday, June 5, 2015

This changes everything - Autodocumenting Workbooks

Your Workbooks can document themselves, now that programmatically creating and injecting Dashboards that document Workbooks into them is a reality. And it's free.

Consider the Regional Sample Workbook. It has six separate vizzes, but one can't easily tell whether any of them is a Dashboard without visually inspecting it. Nor can we see which Worksheets a Dashboard includes without opening it, and there's no way to tell how all the Dashboards relate to all the Worksheets, or the Worksheets to the Data Sources they access.

Imagine, if you please, what it would look like if there was a way to have the Workbook -> Dashboards -> Worksheets -> Data Sources relationships teased out of the Workbook, rendered graphically, and then added back into the Workbook as a self-documenting Dashboard.

Would you like that? Would it be handy? Useful? Maybe? Until now, this sort of thing has been difficult, awkward, and manually intensive. If it was doable at all.

Not any longer. It's now simple and straightforward, with a minimum of fuss, for one workbook, or a whole boatload of them.

Here it is — the Regional Workbook, autodocumented,
with the map of its Dashboards, Worksheets, and Data Sources automatically generated and injected into it.

Adding documentation to Workbooks, where it really matters, has always been a manual, laborious process, with so much friction that it hasn't been feasible at scale. No more. Tableau Tools now has the ability to inject dashboards into existing Workbooks automatically. Coupled with its existing abilities to generate useful content about Workbooks, this opens entire new horizons for enriching Workbooks with surprisingly little effort.

How the magic happens.

The simple version: run a simple Ruby script in a directory containing Workbooks to be documented, and presto! the Workbook(s) are documented with the desired content; in this case the D->W->DS maps for each.

Run this code in a directory containing Workbooks.
  #  Copyright (C) 2014, 2015  Chris Gerrard
  #
  #  This program is free software: you can redistribute it and/or modify
  #  it under the terms of the GNU General Public License as published by
  #  the Free Software Foundation, either version 3 of the License, or
  #  (at your option) any later version.
  #
  #  This program is distributed in the hope that it will be useful,
  #  but WITHOUT ANY WARRANTY; without even the implied warranty of
  #  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
  #  GNU General Public License for more details.
  #
  #  You should have received a copy of the GNU General Public License
  #  along with this program.  If not, see <http://www.gnu.org/licenses/>.

  require 'twb'

  puts "\n\n\t Documenting Workbooks with Tableau Tools"
  puts "\n\t Adding Dashboard -> Worksheet -> Data Source graphs"
  puts "\n\t https://github.com/ChrisGerrard/Tableau-Tools"
  puts "\n"

  path = if ARGV.empty? then '*.twb' else ARGV[0] end
  puts "\n\t Files matching: '#{path}'"

  Dir.glob(path) do |twb|
    puts "\t -- #{twb}"
    twb        = Twb::Workbook.new(twb)
    dotBuilder = Twb::Util::TwbDashSheetDataDotBuilder.new(twb)
    dotFile    = dotBuilder.dotFileName
    renderer   = Twb::Util::DotFileRenderer.new
    imageFile  = renderer.render(dotFile,'png')
    dash       = Twb::DocDashboardImageVert.new
    dash.image=(imageFile)
    dash.title=('Dashboards, Worksheets, and Data Sources')
    twb.addDocDashboard(dash)
    twb.writeAppend('dot')
  end

The code is available on GitHub here.
Not much to it, is there?
Tableau Tools does the work, needing only a little glue code to compose the specific functionality.

Here's the code running in a directory containing Tableau Sample Workbooks.

As the code runs it picks up all the *.twb files in the directory and processes each in turn.


  Dir.glob(path) do |twb|
  puts "\t -- #{twb}"
  twb        = Twb::Workbook.new(twb)

Each file has its D->W->DS map built with:


  dotBuilder = Twb::Util::TwbDashSheetDataDotBuilder.new(twb)
  dotFile    = dotBuilder.dotFileName
  renderer   = Twb::Util::DotFileRenderer.new

The D->W->DS map image file is added to a Dashboard which is then injected into the Workbook with:


  dash       = Twb::DocDashboardImageVert.new
  dash.image=(imageFile)
  dash.title=('Dashboards, Worksheets, and Data Sources')
  twb.addDocDashboard(dash)

The Workbook is written out, as a copy with '.dot' added to its name with:


  twb.writeAppend('dot')

There's no technical reason to create a new copy of the Workbook, and in a production environment the Workbooks are mostly documented in place, i.e. keeping their own names.

Dependencies

Ruby 1.9.3
is preferred, almost exclusively because the Nokogiri gem that's required under the covers works seamlessly with 1.9.3 (on Windows), and not so seamlessly with Ruby 2.x. See the Nokogiri section below for more information.


The Twb gem
is required, and is declared as such via "require 'twb'", the first executable (non-comment) line.
The gem is installable via "...> gem install twb", which assumes that the "gem" command is installed, which should have happened when Ruby was installed.


 ... Tableau Sample Workbooks> gem install twb
 Successfully installed twb-0.3.2
 1 gem installed
 Installing ri documentation for twb-0.3.2...
 Installing RDoc documentation for twb-0.3.2...

 ... Tableau Sample Workbooks> gem list twb

 *** LOCAL GEMS ***

 twb (0.3.2)

 ... Tableau Sample Workbooks> 

(post-publishing) note:
The correct gem version is 0.3.2 (as of this writing) – there was a glitch in the gem publishing that I didn't catch at the time of the original post resulting in an obsolete version of it being installed.
Thanks to Philip, Vishwanath, and Matthew for reporting this.


The Nokogiri gem
Nokogiri is an XML and HTML parsing gem that's used by the Twb gem. If it's not installed the 'require' statement in Twb will fail.
Nokogiri is installable thus:


 ... Tableau Sample Workbooks> gem install nokogiri
 Fetching: mini_portile-0.6.2.gem (100%)
 Fetching: nokogiri-1.6.6.2-x86-mingw32.gem (100%)
 Nokogiri is built with the packaged libraries: libxml2-2.9.2, libxslt-1.1.28, zlib-1.2.8, libiconv-1.14.
 Successfully installed mini_portile-0.6.2
 Successfully installed nokogiri-1.6.6.2-x86-mingw32
 2 gems installed
 Installing ri documentation for mini_portile-0.6.2...
 Installing ri documentation for nokogiri-1.6.6.2-x86-mingw32...
 Installing RDoc documentation for mini_portile-0.6.2...
 Installing RDoc documentation for nokogiri-1.6.6.2-x86-mingw32...

 ... Tableau Sample Workbooks> 

note:
Nokogiri is sensitive to the Ruby version being used.
I'm using Ruby 1.9.3, mostly because: it's stable; suitable for my purposes; inertia; and Nokogiri didn't work with 2.x Ruby in the limited testing I did.
There's some documentation online about Nokogiri and Ruby 2.x but I've not had the time to puzzle out how to get them to play nice together.


Graphviz
The diagrams are created using Graphviz - open source graphing software, which can be downloaded and installed from here.

Graphviz location
In order for the graphs to be rendered, Graphviz needs to be available for use by the Twb::Util::DotFileRenderer object, which assumes the default Graphviz installation: 'C:\tech\graphviz\Graphviz2.38\bin\dot.exe'.
'dot.exe' is the Graphviz program that renders this particular graph type—there are many other types, each with their specific rendering program.

If Graphviz is installed into another directory, its location can be communicated to the Twb renderer by adding the second line shown below to the code:


  renderer   = Twb::Util::DotFileRenderer.new
  renderer.gvDotLocation=('dot.exe location') 


To infinity, and beyond.

With the Workbook-injection nut cracked, there's no end to the things that can be done. Whatever can be thought up can be created and added to your Workbooks.

But wait, there's more.

Workbook autodocumenting relies upon Tableau Tools' ability to modify and write Workbooks. This core feature makes it possible to compose functionality to accomplish pretty much anything you could want to do with and to your Workbooks. The door is open, the future beckons, and it's going to be a fun ride.

About Tableau Tools

Tableau Tools is open source software for interrogating and manipulating Tableau Workbooks and other artifacts. It has two main parts as of this writing:

  • The TWB Ruby gem, available on Github at: https://github.com/ChrisGerrard/TWB, is the core element. It models Workbooks and their major components, allowing for easy access to and control of them. Based on Nokogiri, it provides the opportunity to employ Nokogiri when access and manipulation of deeper components for specific uses is needed.
  • The Tableau Tools project, also available on GitHub: https://github.com/ChrisGerrard/Tableau-Tools contains tools for Tableau assessment and manipulation, including all manner of useful tools that can be composed into richly functional suites with minimal fuss and bother.