Rewriting the LMS Story: Part 1


Note for my semi-regular (not so much lately) readers: Apologies for my absence. I recently took a position with a new organization. The transition has soaked up all of my cycles, but I think things are starting to normalize.

What would you build to enable access to structured learning opportunities?

If you were going to design a scalable system that enabled broad access to training opportunities across a large, geographically dispersed organization, what kind of system would you build? If you wanted to show an organization’s collective progress toward proficiency and readiness, what kind of system would you design? Chances are, many of the characteristics and features you would design into such a system would look similar to those offered in a typical LMS.

This piece isn’t meant to bash LMS products, but to question the premise and focus of this type of tool. LMS products are well intended by both the vendors that build them and the organizations that deploy them. Few enterprise systems focus on individual pursuits at such a granular level. LMS deployments *can* be a tremendous strategic asset. Features of LMS *can* deliver great value for every employee in the organization. Unfortunately, many (maybe most) LMS deployments… don’t.

The problem(s) with LMS deployments

For many folks who use an LMS, these tools are anathema. This perception could stem from a misguided notion (on the part of those who would purport to manage learning) that an internal biological process (learning) can be regulated by a loosely coupled external technological stimulus (a management system). Moreover, many of the most common LMS features apply more energy to providing for those who watch the system than for those who use it to improve their own performance. This imbalance describes much of the problem I have with the idea of the LMS. Systems focus on launch-or-attend and track. They tend to exist for the benefit of the organization as watcher, not the person as experiencer.

What characteristics exemplify these problems?

Again, this isn’t necessarily a swipe at LMS vendors. Customers drive priorities and shape implementations. Without pointing fingers, here are 6 of my least favorite LMS characteristics expressed as pseudo-requirements. I think you’ll find at least one thing here that resonates with your own experience.

1. The system must provide an environment that makes it easy to unleash ideas that torture the relative few onto the rest of the organization.


The label LMS implies a certain paternal push mentality. We push things to our “learners” in a content monologue. At worst, this monologue is a poorly produced, unreliable, dysfunctional mess with manifestations tantamount to torture. At best, it’s often content on a conveyor belt.

When we make it easy to distribute information, we also make it easy to broadcast well-intended but poorly selected and designed messaging. Much of the e-learning we see hung on learning management systems creates shallow communication without creating deep conversation. Broad messaging deployment with low costs of entry can tend to create pollution in an already polluted stream. Sure, an LMS can be super-efficient…

“There is surely nothing quite so useless as doing with great efficiency what should not be done at all.”  ~Peter Drucker

At some point, for better or worse, we all have an idea that we think will benefit the business. We don’t always consider whether or not we should, or how much it will actually matter. The LMS provides a platform to propagate ideas that compete for the attention of the participant. We don’t always coordinate these broadcasts well.

2. The system must dispense events within a narrow set of human contexts.

Let’s face it. One of the LMS’s primary value propositions is that of an event dispenser. Folks come to the LMS, find something in the catalog, and register for a self-paced or facilitated event.


The problem is not that the system functions as an event dispenser. The problem is that the classification of events focuses on a very narrow set of human contexts. Most LMSs focus on the solo frame and the group (one-to-many) frame while ignoring the rest of the spectrum.


People are often found in pairs, teams, communities, and societies. Funny how we work. A too-narrow focus prevents us from connecting people with people where and when they are, and in the ways that we tend to gather naturally. While the LMS doesn’t need to provide all of these opportunities, a system with the right design won’t exclude them and, more importantly, won’t prevent smooth flow between them. It’s not about bolting social tools onto an LMS; it’s about being mindful of opportunities and opening system discovery and matching features to take advantage of these contexts.

3. The system must disconnect “Learning” from work

One of the problems I, and many others, have with the “Learning” management system has to do with the lexicon and language used to describe the container system as well as the stuff that it’s supposed to dispense. There is no question that people *can* learn from pathways created within an LMS. However, the events themselves aren’t learning. People learn. Events don’t.


Perhaps the greatest fault of the Learning Management System is the propensity of these fortresses of content solitude to completely isolate learning experiences from work experiences. Whether or not this is intentional, events within the system are typically deployed in ways that remove the participant from work contexts to receive the training experience. While this isn’t always a bad thing, this configuration doesn’t provide for the boundless potential of learning experiences that are connected with real needs and work challenges. Some of the best learning experiences are directly connected with work. Many of the best work experiences are directly connected with learning.

4. The system must create unpleasant user experiences as a rule.

An LMS content library is a collection of objects. Some of these objects seem to be designed to defecate directly into the soul of the participant. Add this to the catalog link dumping ground problem and usability issues… Horrifying.


5. The tool must be activated as “yet-another-IT-system”, completely independent of similar systems


Large enterprises are full of systems. These separate systems can seem to be in competition for operational territory. Sometimes these systems talk to each other. Sometimes they all use the same authentication pool. Sometimes systems complement the strengths of other systems and bond together to make things easier on the people that use them. Sometimes… they don’t.

6. The system must focus the most energy on the things that mean the least to the people the system is intended to help


In our attempts to control everything that happens in our organizations, to know who is doing what they have been assigned, and to create reports for reporting’s sake, we create monsters. These monsters focus more on features that feed the panopticon cycle than on helping people do what they need to do. Even though I question the usefulness of a CYA reporting mandate in actually affecting behavior, external reporting mandates and other hierarchical reporting requirements are necessary. However, these features shouldn’t be the most prevalent uses and value propositions of the LMS.

…that sounds hopeless…

These characteristics, while common, aren’t rules. They don’t need to apply universally to systems intended to help people. We can do better. The future’s systems, better systems, might not be recognizable (or labeled) as LMS. This would suit me fine. Rather than chase management of learning, maybe systems of the future will focus on Work and Learning Support, with an emphasis on work and support.

This is the vision I’m pushing for in my organization and I think we’ll get there. In my view, the first step is dropping the ideas of management and learning exclusively as a packaged event. Big change. Big promise. Worth changing our collective mindset about LMS.

Do, Believe, Be

In the past weeks I’ve seen a bit of discussion surrounding fast and slow learning experiences and the strategies for setting up and supporting each. Some voices advocate exclusively for a chronic learning journey, bemoaning things like performance support and on-demand learning. Others yearn for a more acute experience, providing just what folks need when they need it. Very few of these discussions seem to draw a dividing line between on-demand learning and support experiences (just in time) and protracted learning campaigns (spaced over time). But there does seem to be some polarization on the gradient between the long-game and the short-game. Why does it need to be one or the other? Can’t we have both?

There is room for both a short and a long game for those looking to increase proficiency, maintain readiness, and get stuff done. In my view, there are three potential goal categories of change we want to encourage and assist. The goal category can be an indicator of short or long strategies. Here’s the way I think about teeing up opportunities for both short and long learning experiences.


1) We want workmates (and / or they want themselves) to be able to DO something they weren’t able to do before. To accomplish a task or demonstrate a skill that contributes to an outcome.

2) We want folks to BELIEVE or value something that’s important to the organization or to themselves, or that improves the treatment of fellow humans. For example, it may be difficult to perform if you don’t believe in the organization, the product, the process, or yourself. Sometimes, many times, choosing the hard-to-do right thing is about valuing the right thing more than the easier-to-do wrong thing.

3) We want our partners to fill a job role and BE what they can be to the organization or for themselves. Advancing from apprentice to master, climbing the career ladder from entry level to master of craft. Growing to meet their potential.

This mapping illustrates three related categories of accomplishment. Thinking this way, there’s probably a greater chance that a DO goal will match a short-game strategy than a BE goal. Not always, but it’s likely that DO will be shorter term than BELIEVE, and BELIEVE shorter than BE.

This categorization could also provide some relational framing. For example, a BE goal could contain several smaller BELIEVE and DO goals, shorter strategies or campaigns that contribute to a long game strategy. A DO goal could contain a BELIEVE goal. A BELIEVE goal could contain DO goals, and so on. Mapping these frames could create a very complex picture of goals and sub-goals.
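As a thought experiment, these nested frames can be modeled as a simple goal tree. This is only an illustrative sketch: the `Goal` class, the category labels, and the example goals are my own invention for this post, not features of any LMS.

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """A hypothetical goal frame: DO, BELIEVE, or BE, with nested sub-goals."""
    name: str
    category: str                 # "DO", "BELIEVE", or "BE"
    subgoals: list = field(default_factory=list)

    def flatten(self):
        """Return this goal followed by all nested sub-goals, depth-first."""
        found = [self]
        for sub in self.subgoals:
            found.extend(sub.flatten())
        return found

# A BE goal containing a BELIEVE sub-goal, which in turn contains a DO goal.
be = Goal("Master of craft", "BE", [
    Goal("Value the safety culture", "BELIEVE", [
        Goal("Complete a safe tool change", "DO"),
    ]),
    Goal("Troubleshoot a failed unit", "DO"),
])

print([g.category for g in be.flatten()])  # → ['BE', 'BELIEVE', 'DO', 'DO']
```

Mapping even a handful of goals this way shows how quickly the picture of goals and sub-goals becomes complex.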


This picture drives a conversation and produces questions that give birth to other questions. What business measure do we want to improve? What opportunities do we want to queue up? What does the entire canvas look like and when can we help? How is the business impeding (or providing counter-incentives to) these goals? When is it best for us to get out of the way and let people take control and plot their own course?

“DO, BELIEVE, and BE” is one way to frame the conversation around the ways we might want to move the needle. How do you frame your conversations?

The Experiences, Support, Reflection Cycle

Clark Quinn penned a post last year titled Reimagining Learning. Inspired by Clark’s concept, I built this diagram to illustrate how I understood the concept and the relationships between its elements.


This relationship and cycle agree with the investigative work of Ericsson, Prietula, and Cokely (summarized for the masses by Malcolm Gladwell as “10,000 hours makes an expert”). Ericsson, Prietula, and Cokely connected studies by notable folks such as Benjamin Bloom to posit and highlight three components most common to building high levels of expertise:

  1. Deliberate practice. Progressive application of skills and experience performing tasks in authentic work contexts are key to mastery and development of expertise. Even though this seems like a no brainer, how often do we see a focus on content over practice in analogue and digital contexts?
  2. Expert coaching. Feedback and guidance make critical connections with deliberate practice. How often do we turn folks loose to learn on their own? How often do we provide generalized feedback vice adaptive expert feedback? Stumble through the mountains or journey with a sherpa. Which yields more consistently positive results?
  3. Enthusiastic support. Encouragement is so critical to every endeavor we pursue. Have you ever continued down a path that you otherwise would have abandoned simply because a family member, friend, or supervisor was your personal cheerleader? Yeah, that.

The experiences, support, and reflection cycle illustrated above is one way to weave in the components we know to be most helpful in developing expert performance.

I love the concepts Clark expresses in his reimagined engagement cycle / formation. This concept carries bits of cognitive apprenticeship, emphasizing key practices of reflection and sustained focus on relevant, authentic activities (making stuff, experimenting, and ample deliberate practice).

Is this a perfect formula for developing expertise? Probably not (perfect formulas aren’t perfect). But it’s a great starting place if you’re in it for the long haul.

If we care about developing proficiency (in ourselves or helping to encourage and facilitate proficiency in others) we had better be in it for the long haul. Fast-food style training services may get our folks “fed up”. We can’t build champions on a fast-food menu.

Trust and Chaos

I’ve long held this theory about the relationship between trust and chaos (which often results in conflict). The theory started out like this.

Trust and chaos have an inversely proportional relationship. As trust nears zero, we’re all going to be in serious trouble.

This statement implies that as trust goes up, chaos decreases and vice-versa. The more I think about it, the more I think this isn’t exactly right. How could it be right? If it were this simple, trust would be universal, right? And we know this isn’t the case. Not even close.

I do think there is a direct relationship between trust and chaos. Trust is at the heart of most, if not all, of the conflicts and problems on this little planet. But the relationship is far from simple.

Earned & Granted

There are two ways I’m now thinking about the relationship between trust and chaos. The first is a simplified categorization of trust. This categorizes trust as earned and / or granted.

Trust Venn Diagram

This is the first opportunity for conflict. And in this logical equation, multiple situations can arise:

  • Trust can be earned but not granted.
  • Trust can be granted but not earned.
  • Trust can be earned and granted.
  • Trust can be neither earned nor granted.

Only one of these situations minimizes conflict. A combination of these situations can compound conflict or chaos. For example, if one member earns trust but isn’t granted it while another is granted trust without earning it, ugly conflict is sure to brew below the surface. Many workplaces have their share of empowered incompetents. Conflict in these environments is probably significantly higher than in environments with an equitable distribution of trust. Ideally, trust is granted until removed (temporarily) as a consequence.

Which workplace do you think operates at potential? The workplace with low trust / misplaced trust or the one with high trust / equitably distributed trust?

Local & System Trust

Mario Vittone, a friend of mine, recently retired from Coast Guard active duty. Back in 2008, he wrote a fantastic article titled “The Missing Competency”. In his article, Mario writes that trust is the missing competency in leadership development: the component of leadership that is seldom spoken of in formal development programs and rarely practiced with intent. He gives some great advice for increasing local trust, in particular methods you can use to increase your personal trustworthiness score. Read Mario’s stuff! You won’t regret it.

But I see local trust and system trust differently. I see tight coupling with local trust and loose and complex coupling with system trust. Think about how much you trust the U.S. Congress this year or last year. Now compare that with two decades ago, if you can remember back that far. Is your trust of that system component higher or lower? Now think about large banks and Wall Street. Do you have higher or lower trust in these systems than you did a decade ago? System trust isn’t universal and it’s far more complex than local trust.

A Trust Curve

I’ve revised my original theory about the relationship between trust and conflict or chaos to look more like a curve. And this curve isn’t the same for every context.

Trust Curve

Trust seems to have a band of effectiveness on average. While I think we should 1) grant trust more often and 2) push for more trust, I don’t think 100% trust is the answer in every context. There is likely a threshold of trust that provides the best return on energy used to build the trust, beyond which conflict could be higher.

For example, consider the much maligned position of IT services within most organizations. Opening up 100% trust provides a risk calculation that is uncomfortable or unacceptable to those that govern IT resources and are charged with protecting the organization, its business, and its data. In this situation, zero trust is completely counterproductive (though common). 100% trust (though rare) opens the organization to risks that IT management believes are unacceptable. So the default response to many requests for an increase in trust or access is no. Right or wrong, this is a common position. It’s a balancing act that tends to err on the side of caution.

Diminishing returns

Within any optimized trust band (best discovered through experience and honest evaluation), there lies a point of diminishing or negative return. The hard part is figuring out where this balance is and leaving both benefit of the doubt and options to increase the window of trust when the returns are positive. This becomes a game of trade-offs: a min-max game where you aim to maximize the positive and minimize the negative. It’s almost never simple.

Low trust makes things that should be easy excruciatingly hard to get done. Unbalanced assignment of trust creates subsurface conflicts and can destroy a team’s motivation.

How do you maximize trust and minimize chaos?

ETMOOC: Pressure Systems – MOOCs like the Weather


I’m a week into the Educational Technology and Media MOOC (etmooc) experience. This style of experience is difficult to describe. It’s a BIG conversation among hundreds of folks (1,500 or so) from all over the globe with similar interests, across multiple channels, in a large and loose structure. The course leverages Twitter, Google+, live sessions, and blog aggregations from participants to expose these conversations to the group. Yeah, 1,500 participants mixing across multiple tools. If you don’t have well-tuned filters, this can be seriously overwhelming. It’s chaos. But not necessarily in a bad way.

As this experience coalesces, I see people doing many things for the first time. First blogs. First time using Twitter. Sharing different types of simple media production tools and producing video narrative to “make learning visible”. As the experience progresses, folks are beginning to form smaller groups of resonance in this great big space. Through a natural process, smaller spaces form within the bigger space.

This is the most fascinating thing to me about this experience. Observing the social dynamics of a very large group engaged in a really big conversation is pretty fantastic. Each participant is working around a couple of big questions:

  • How are you making your learning visible?
  • How are you contributing to the learning of others?

Here’s the way I am thinking of the MOOC. It’s working like a pressure system, behaving with dynamics not entirely unlike weather patterns. The loose course structure and the promise of learning about how to improve pedagogy with communication technologies create a pressure well, drawing folks in. Participants come from different fields and backgrounds, with different perspectives, and with different levels of commitment and intensity.

Not entirely unlike a weather system, differences in pressures (interests) form eddies. Groups of interest are starting to emerge. I anticipate this will continue through the remainder of the course.

This isn’t my first MOOC, but it is my first connectivist MOOC experience. It may also be the first MOOC I finish. I enrolled in a few content MOOCs, but the combination of linear structure and no real incentive caused an attention dump early on.

I’ve been inspired by the passion of the educators in the group. It’s been a positive experience. Maybe I’ll make it through this one. Time will tell.

ETMOOC Introduction

I’m Steve Flowers and here’s how you can build a mental model of my construct.

Note for my 2 (or so) regular readers: I’m participating in #etmooc, a massively open online course for education technology and media. This is assignment 1. I think there are still open spots if you want to join in.

1. Steve’s personal and professional life are hopelessly intermingled

My personal and professional life are a seriously commingled mess. By that, I mean that it’s difficult to see where one stops and the other begins. I suspect that this is fairly common among folks who are passionate about what they do.

On the professional side, I have worked for the U.S. Coast Guard in one form or another for longer than 20 years. I was an active duty electronics technician for just over 10 and I’m currently a performance technologist / solutions consultant with the USCG Performance Technology Center. That means I’m sometimes a consultant, sometimes a developer, sometimes a producer, most times a designer (technical, communication, instructional, or process — really whatever the situation calls for, I try to fill the need). I build digital stuff like EPSS, tools, and eLearning.

I’m a tool pilot and tend to master tools quickly. There aren’t many classes of tools I haven’t driven or taught others to drive. Consequently, I’m less interested in tools than I am in outcomes, goals, results and the covert or tacit things that contribute to outcomes, goals, and results.

2. At the heart of it, Steve is (proud to be) nothing more than an assistant

If you remove all of the lofty terms that define a discipline, at the heart of it I’m just an assistant. I help the apprentice, the journeyman, and the master get done what they need to get done. I try to enable growth and development where I can, but I try to stay out of the way.

3. Steve writes stuff and participates in communities

There’s a book I co-wrote with a couple of other fine fellows to support new Articulate Storyline users. I also participate regularly as an Articulate MVP (Super Hero) on Articulate’s E-learning Heroes Community.

4. Steve likes his friends and his network. 

I’m fortunate to have a network of really smart friends that like the same stuff I do and a global network of tools that makes it easy to stay connected. If you’re reading this, you’re automatically one of those friends. So, thank you.

5. A lot of things interest Steve. Here are a few of them:

  • Application of technology to help people 1) get things done and 2) develop their skills. Particularly interested in mobile / accessible and Web-based technologies and structures. Have written a few things on the xAPI (Tin Can API).
  • Use of everyday household technology to generate things of use to other people. This includes a strong interest in the use of video to capture and share authentic experiences or articulate ideas. This tech is ubiquitous. The barrier isn’t gadgetry, it’s habitry. I made it my goal to use a tablet / device to generate much of the media I’ll produce this year and I hope to share some things I’ve learned with the folks in this group. Did I mention that I have a technology addiction?
  • Profiles, lenses, patterns, and frameworks are a passion of mine. I think lexicons of work and decision paths are important to validating a body of knowledge. I don’t believe we do enough in the education and training space to communicate the things that work and consequently we rarely discuss why. To me, this is critical to the development of (an admittedly fractured) professional discipline.

6. Steve is opinionated as all get out. Here are some things Steve has opinions about:

  • Culture as the core influencing signal on behavior (how we are influenced by and influence this signal)
  • How often we seem to avoid showing our work and end up arbitrarily making decisions about delivery mediums and media with big holes in our data support.
  • The biggest problems in education center on 1) a focus on information [vice capability] as a central object and 2) poorly designed and administered assessment mechanisms that result in misplacement and mismatches between certification and competence. Both of these are common but not universal. That’s what makes the problem so darn hard to diagnose.
  • Everything (yes, everything)

7. Steve is happy to be on this journey with y’all.

Happy to be here. I anxiously anticipate the adventure.

Wavelengths of Change

In a meeting last week, we discussed adoption of mobile technologies. I suggested that different folks would operate and adopt at different wavelengths. So a one size fits all expectation might not work out. Some will take a shorter focus and try to change something small without seeking a larger goal of system alignment.


Some efforts will seek that broad signal and gradually move towards a holistic change. When I initially thought about it, I thought in terms of two wavelengths. The first, I thought, was evolutionary and the second revolutionary. On reflection, it really seems like there’s more to it than this.

Maybe efforts could be categorized into three areas of focus:

  • Isolated transformation (incremental / isolated)
  • System evolution (measured / holistic)
  • System revolution (radical / holistic)

This isn’t about novel innovation for novel innovation’s sake (which tends to sound like “because it’s cool”). Think about this type of change as an effort to scale up ways of doing business that make a positive difference.

Isolated transformations tend not to be aimed at long-term system change. This type of change seems to be common. Think about the small, incremental transformations you’ve seen. If these don’t lead to more holistic evolution, where do they lead? And why don’t the goals of these micro-transformations connect with more holistic goals? I imagine that the answer is that these types of changes are easier to accomplish and less threatening to organizational control structures. If I wanted to avoid changing the system, why would I make any efforts toward change?  Everything is connected. Isolated transformations don’t make much sense to me.

That’s not to say that hacks and smaller efforts can’t tie to a larger system goal. It’s just that usually… they don’t. If you’re familiar with e-learning deployments, you’ve probably seen insulated efforts that don’t connect or tie to bigger picture efforts. While these might be loosely tied to a business goal, they tend to be “one and done” and “fire and forget”.

System evolution defines a tempered approach that migrates and scales organizational practices, policies, and processes toward something better.  System evolutions happen over time but always start with the end in mind.

System revolution is a less tempered approach that can violently disrupt a larger system to enact broad change. Like system evolution, revolutions start with the end in mind, but the force and speed of the change can cause collateral damage to organizational structures. When systems are really broken, a revolution might be called for.


In every case, successful holistic changes are driven by visionary foresight.  If you’re driving toward a change, what kind of change wavelength are you on?

Showing Our Work: Design Economics

Showing Our Work
Photo by Irving Rusinow [Public Domain], via Wikimedia Commons
You start a new project and receive a heap of content from stakeholders. Your shiny new stack of content arrives with an expectation from the project owner that you deliver that content (every ounce of it) to the mental doormats of their audience. In an attempt to do the right thing, you dutifully conduct an analysis of the performance and skills that the program you’re helping needs to address. You get to know the audience. You frame a set of performance objectives around a reasonably rigorous analysis. Despite this effort, you’re still stuck with a big pile of content.

The tail wags the dog. You dive into the rabbit hole head first. You search for aesthetically pleasing layouts and graphics that will grab attention. You look for ways to make an awful experience less awful. Soon you’ve invested a ton of time trying to engineer treatments for content and you’ve lost sight of the purpose of the solution (if ever there was a purpose).

Sound familiar? You bet. When you enter into a battle with content treatments, it’s easy to get lost in the noise. That noise makes it easy to lose sight of the things we can do to really make the biggest difference. Unfortunately, this unenviable situation is far too common. It seems unfair to give information this much of our attention, doesn’t it?


In the performance solutions field (ISD, learning experiences, whatever you want to call what we do), we often look at other fields of discipline for guidance and inspiration. We often examine other design disciplines or areas of media production, looking for better ways to position and polish our communication. As helpful as these fields might be, we may be overlooking an important, albeit less sexy, discipline. The field of economics may hold one key to making solutions worth the effort we put into them.

Economics is a process that converts inputs that have economic value into outputs that also have economic value.

In design, we deal with many factors and artifacts (currencies). Ideally we’ll weigh all of the inputs we can to make decisions that provide the best trade-offs and consequently the best outputs and value for all stakeholders, given the inputs we have to work with. Content is just one of these currencies. Other currencies include:

  • Target accomplishments
  • Tasks that enable the accomplishments
  • Skills that enable the tasks
  • Components of the skills that enable the tasks
  • Audience factors
  • Environment factors
  • Technology supports / constraints
  • The client’s wants
  • The client’s needs

An experience itself is defined by economics. Good experiences consist of moments that drive toward the design intent or the goal of the solution. That’s a deep one; I’d rather go down that route in another post.

Good economics supply greater value in the outputs than the inputs, making the outcome worth the effort. While it’s not as sexy as User Experience (UX) or game design, doesn’t economics sound like a field worth looking into?

Make the outcome worth the effort.

Often, maybe too often, we make decisions that seem to draw magic lines between content and solution without consideration of the intent of the aggregate of those decisions. Starting with purpose in mind is a sensible guideline. It’s easy and intuitive to point to this and say “Yeah, that’s the right way to do it” but it’s challenging to follow in practice. I guess that’s why we call it a discipline.

Here are two suggestions you can use to battle the magic leap and improve your outcome-to-effort ratio.

  1. Look at your solutions as economic problems. Map the structure of your problem so you know exactly how the structure of your answers matters in the big picture. Understand the structure of the task down to the skills and all of the factors that shape the execution of each skill. Know what concepts drive the skill. Know the common interpretations of any rule that influences the execution of the skill. Use this structure to validate your decisions. If something doesn’t stick, dump it.
  2. Don’t start with content. Don’t get hung up on content. While content is often an important artifact, it’s the wrong starting point, and in many cases it hasn’t earned the effort we invest in it. Getting lost in the (often horrible) noise of treating content is a great way to lose sight of the big picture and a terrible way to deliver value. It’s hard to win the performance war if we tie all of our energy up in a content battle. Content is a means to an end. Starting with a focus on content will make it a slog for you and a slog for your participants. Pick another starting point.

Everything we do and every choice we make has a cost. Every cost stacks debt. That debt can manifest as time, money, disappointment, or simply things you need to fix to make it work (costing time, money, and disappointment). This debt adds up. Design is an economic problem. How do you account for your design debt?

In follow-up posts, I’ll talk a little about a process I’ve been exploring that takes a deeper look at the currencies we use in design, and a way to change the tendencies that give so much weight to content.

How are you showing your work?

Tech, People and Systems

The article below was written for an internal audience to illustrate the potential of the Experience API (Tin Can API). This brief overview contains narrative descriptions of four use-cases.

The use cases described below illustrate projections and opportunities that could help to resolve some organizational challenges; they do not represent plans. This article doesn’t necessarily represent the viewpoint of the U.S. Coast Guard or the Department of Homeland Security.

The DoD ADL Initiative is working on a set of technology standards to enable interoperability among training and learning systems. This new set of standards is called the Training and Learning Architecture (TLA). The first technology standard to emerge from the TLA is the Experience API (xAPI), also known as the Tin Can API. Unlike current standards that focus on connecting content with systems, this standard creates a language and technology framework that focuses on connecting people with systems. More specifically, the Experience API enables systems to capture the actions, activities, experiences, and accomplishments of people. Continue reading Tech, People and Systems
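To make the “connecting people with systems” idea concrete: at the heart of the Experience API is a simple actor–verb–object grammar, where each statement records that someone did something. The sketch below builds a minimal statement as plain Python data. The person, email address, and activity ID are made-up examples (the verb ID follows the pattern used in the ADL verb registry); this is an illustration of the statement shape, not a working Learning Record Store integration.

```python
import json


def make_statement(actor_name, actor_email, verb_id, verb_label,
                   activity_id, activity_name):
    """Build a minimal xAPI-style statement: actor, verb, object."""
    return {
        "actor": {
            "objectType": "Agent",
            "name": actor_name,
            "mbox": "mailto:" + actor_email,  # email-based identifier
        },
        "verb": {
            "id": verb_id,                     # URI identifying the verb
            "display": {"en-US": verb_label},  # human-readable label
        },
        "object": {
            "objectType": "Activity",
            "id": activity_id,                 # URI identifying the activity
            "definition": {"name": {"en-US": activity_name}},
        },
    }


# Hypothetical example: a person completed a drill.
statement = make_statement(
    "Pat Example", "pat@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "http://example.com/activities/damage-control-drill",
    "Damage Control Drill",
)
print(json.dumps(statement, indent=2))
```

Because the statement is just structured data about a person’s experience, it can describe activities far beyond course completions: on-the-job tasks, simulations, mentoring conversations, and so on.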

Artificial Competence

This post will mix a kitchen remodel with some video game nostalgia. Believe it or not, these have something powerful in common. Artificial competence isn’t a bad thing if it gets the job done.

Chances are, you’ve played Super Mario Brothers at some point in your life. Super Mario Brothers is a side-scrolling platform game in which you, the player, control Mario. Mario can walk, run, and jump through puzzles, defeating enemies using permanent or temporary abilities. The designers of the game strategically placed performance-support power-ups throughout the game to provide options for solving puzzles and defeating enemies. Without these power-ups, tasks within the game are more difficult to accomplish.

A kitchen remodel

We’ve been planning to remodel our kitchen since moving into our townhome 8 years ago. The old pine cabinets and light blue countertop had each overstayed their welcome. This year we took the plunge and decided to do it ourselves. I have some construction skills but had never tackled anything as complex as a kitchen. I was going to need a special kind of magic to get this done without help. Continue reading Artificial Competence