Immersive Learning

Surprise aboard Ambition

That’s engagement! A burning tanker has just exploded right in front of their drone’s camera, part of their immersive learning experience at the National Flight Academy.

I had to let everyone see this photograph of young people engaged in a humanitarian mission to rescue people from a disaster at sea. They are running the mission from the Joint Operations Center aboard Ambition, a new aircraft carrier outfitted with DARPA experimental aircraft called X-12 Triads, and are observing, via a drone they sent out, the scene of the disaster, where another squadron under their control is pulling people out of the water and fighting the fire threatening them. The fire is aboard an unregistered tanker carrying toxic, highly flammable cargo, and a thunderstorm is reaching its peak. At the moment of this photo, the drone happened to be looking at the tanker when lightning struck it and it exploded.

I would have to say that at this point the learners’ immersion is complete. They will overcome the initial shock, jump into action, eventually save everyone in immediate danger, and hand the operation off to the Coast Guard. By now they have been aboard for five days and have become quite adept at complex problem solving and critical thinking. They are doing sophisticated math in an applied setting, along with chemistry, physics, and other subjects, in pursuit of solutions to the problems they face every day. Don’t you wish every young learner’s face had that expression when faced with a complex problem to solve?

Really Emerging Technologies

The Institute for Human and Machine Cognition champions the concept of cognitive prostheses

For most people, computing technology is something that extends a capability we already possess. Productivity tools help us produce more work. Media help us shift time; that is, we can observe something that happened at another time. Some computing technology helps us shift place far more dramatically than we could ever manage on our own, like going to the moon.

One area that is only now beginning to be exploited is what I have heard Dr. Ken Ford at the Institute for Human and Machine Cognition call “cognitive prostheses”. The concept is simple enough to describe: using computers to extend the capabilities of our own cognitive processors, our brains. It is extremely difficult to do in practice. This brings me to the photograph to the right.

The subject here is wearing a headband with an optical sensor (the CCD chip from a digital camera) mounted in front. You will notice that he is holding something in his mouth with a cable attached to it. The image information from the camera chip is sent to a computer that converts the signal into current, which is fed to a little flat wafer the subject holds on his tongue. The wafer carries an array of electrodes that transmit the voltage pulses as a pattern through the points of contact with his tongue, much the way an LED display activates individual pixels to create a digital image on a screen. The nerve endings in his tongue carry the pulses from the array to his brain, which routes that information to its vision center, where he is able to interpret the signals as visual information. He can “see” faces and play tic-tac-toe.
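
Just to make that signal chain concrete, here is a minimal sketch, in Python, of the kind of image-to-electrode mapping being described. Everything in it is an assumption for illustration only: the grid size, the scaling, and the function name are hypothetical, not the actual device’s firmware.

```python
import numpy as np

def frame_to_tongue_pattern(frame: np.ndarray, grid=(20, 20), max_level=255) -> np.ndarray:
    """Downsample a grayscale camera frame to a grid of electrode stimulation levels.

    frame: 2-D array of pixel brightness values from the optical sensor.
    grid:  (rows, cols) of electrodes on the hypothetical tongue display.
    """
    rows, cols = grid
    h, w = frame.shape
    pattern = np.zeros(grid)
    for r in range(rows):
        for c in range(cols):
            # Average the pixels that fall within this electrode's patch of the image.
            patch = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            pattern[r, c] = patch.mean()
    # Scale brightness to a pulse intensity the stimulator could deliver,
    # so that brighter regions become stronger pulses on the tongue.
    return (pattern / max(frame.max(), 1) * max_level).astype(int)
```

The point of the sketch is only that the “translation” step is ordinary signal processing; the remarkable part is what the brain does with the result.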

Let me give you a moment to process that…

So, now that you are back, I’m sure you were thinking to yourself, “…and all I’ve got is this stupid smart phone.”

My point is that this type of innovation is only just beginning. When I go back and clean out the postings from this blog in a couple of years, this one is going to look as primitive and quaint as the grey brick cell phones of the 1980s look to us today.

Keep in mind (and I am certain that you are now beginning to question exactly what the term “mind” means) that these are not even new primary technologies; this is just leveraging existing technologies in a creative, and I will argue extraordinary, way.

Can You Run Faster Than a Robot?

The 70 mph Running Robot

While I was showing a group of visitors from Annette Island, Alaska around our area, we stopped in to see what is going on at the Institute for Human and Machine Cognition labs. They were on a STEM tour of Florida and had been out to see the NFA and the EPA labs on Sabine Island. Our stop at IHMC really struck home not only how valuable the creative process is in fostering the innovation that leads to what we have been calling emerging technologies, but also how truly disruptive that creative process is to organizations.

I’ve been thinking a lot about this since recently reading “The Other Side of Innovation: Solving the Execution Challenge” by Vijay Govindarajan and Chris Trimble. Let’s take IHMC as an example. When people come to the Institute for help solving a particular problem, the Institute brings together a team from a very diverse collection of disciplines from all over the world. They agree to work collectively on the problem from their own unique perspectives. The outcome is usually a solution that gets handed off to someone else to refine into a product or a process. The assembled group then disbands and each member moves on to something else, with or without the IHMC. This is an extreme example of the disruptive nature of innovation. A core support team and leadership remain over time, but the rest of the organization is in a constant state of flux.

Few organizations in our culture could withstand that type of whirlwind, and I am certain that as you read this, you are having a hard time imagining the place where you currently work operating under those circumstances… for now. As the technologies that facilitate communication, collaboration, learning, and creative problem-solving evolve, and as many “processes”, not only those involved with production or materials handling but also the tasks now performed by mid-level managers, become automated or supported by automation, the picture changes. It’s certainly a very threatening picture for many people.

When I think of the transition I have personally observed in my own organization, it is especially vivid. When I first started working for the University, we had clerk-typists in our office. The typewriters were replaced, overnight, by mainframe terminals that supported word processing. Supervisors still wrote documents out in longhand on yellow pads and handed them in to be turned into text. When PCs first became available, only the clerk-typists and the secretaries who supervised them got them. Soon everyone wanted one, and that meant the supervisors and managers got them too. They were soon creating their rough drafts in a word processing document and handing a disk to the now “word processors”, who finished them off and printed them out for delivery.

Then the internet hit, and our word processors could email documents to other word processors, who would print them out and put them on the managers’ desks to read. It didn’t take long before managers were sending their own documents to other managers, who read them on their own computers. At that point, a whole class of employees faced an existential dilemma. Up to then, we had just been using new tools to work the same way we had worked before the new tools arrived. This was entirely different.

Some people simply chose to retire. Others sought new jobs, but a core of the staff re-trained upwards to accept more responsibility, which in turn made managers more effective and ready to take on more responsibility. In our case, we were able to grow the number of students at the University at a rate an order of magnitude greater than the rate of growth of the support staff serving them. This is a very simplistic, very early example of the disruption emerging technologies can produce, even in institutions that haven’t changed that much over the past 600 years.

The point is that the rate of change in organizations is certainly going to accelerate. Those who respond to this change through continuous learning, by looking for ways to exploit innovation, and by adopting a forward-looking approach to personal improvement will do very well, and will probably avoid getting crushed by 70 mph running robots.

So, Why Me?

Life out on the edge of things is tricky, but somebody’s got to do it.

How did I come to be the instructor for the Emerging Technologies course?

I suspect that most of the reason has to do with having been fortunate enough to be standing there at critical points in UWF’s technological development, and foolish enough to act on opportunities that everyone else seemed hesitant to take on. Without making any claims to being better at this than anyone else, I will say that I am curious to a fault and, having raised three kids, have learned not to be very concerned about looking foolish (coming to work every day with barf on the back of your shoulder, or a ring of drool stains around your knees, does that to you).

If you are interested, I’ll run down a sort of chronological list of interesting predicaments in which I have found myself over the years. This will not only give you some perspective on what makes me tick, but will also explain how I have come to be the boring old guy you see before you today.

One of the first technology projects in which I was engaged was the capture and digitization of a video clip to play before a meeting of the Board of Regents (yes, that long ago). It took two weeks to configure the capture card, record the video, compress it by hand, and produce a 160×120 image in a standalone player that ran on a three-gun cathode ray projector, which required the entire room to be completely dark to see the shadowy 12 fps, one-minute clip. It was the first one ever presented on campus.

My unit at the time was the pioneering adopter of a local area network of PCs. Back then, all computing was done through terminals attached by coax cable to a mainframe. We had 8086 IBMs running DOS and Novell 1.0 over Token Ring at a blinding 4 Mbps. We even had a file server with a 10 MB hard drive, another first. I initiated the first digital inventory system for our media equipment, and actually managed to get our film and video collection into the library’s online catalog, though it was still mainframe-based at the time.

We were looking for ways to improve the effectiveness of faculty lectures and turned to evaluating and using multimedia authoring tools. I learned and taught Macromedia Authorware, Asymetrix ToolBook, and a dozen in between. Eventually, PowerPoint became useful. I still teach Dreamweaver, Photoshop, and Flash for our Office of Continuing Ed., just for the fun of it (having three kids in college is a strong motivator).

As the internet caught fire, we looked for ways to avoid building every course from scratch by hand as we had been doing, and I ended up selecting and teaching our first two Learning Management Systems (TopClass and WebCT), and have served on the evaluation committee for every one since.

I experimented with ISDN-based video conferencing on the desktop, back when every step had to be configured by hand, and then designed and implemented the next three generations of interactive digital video distance learning systems used on campus.

I was heavily involved in the design of our campus’ optical fiber backbone, and in the evaluation and selection of the Siemens telephony system that we use today.

I managed the development and delivery of the first completely online undergraduate degree program at UWF. This was also the first undergraduate program to be delivered via handheld PDAs (back when they were still called PDAs), under contract to the Coast Guard and Navy.

I designed the underlying technology and worked on the design team for the development of a reusable, object-based integrated teaching and learning system for middle-school science teachers called QuickScience. It had a repository of standards-aligned objects, a scope-and-sequence builder the teacher could use to assemble the curriculum from the repository, and a pretesting system that adapted the standard curriculum load for each individual student with customized combinations of content to remediate the areas needing the most work. The system included an adaptive mastery delivery model that let the learner move on once mastery was established, but also flagged the learner for individual attention when the remediation component was exhausted without mastery.
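
For readers who think in code, here is a rough sketch of the adaptive mastery logic described above. It is purely illustrative: the threshold, field names, and return values are my own assumptions, not QuickScience’s actual implementation.

```python
from dataclasses import dataclass, field

MASTERY_THRESHOLD = 0.8  # hypothetical passing score; the real system's criteria differed

@dataclass
class LearnerPlan:
    """One learner's path through a standards-aligned module."""
    module: str
    remediation_queue: list = field(default_factory=list)  # customized objects from the pretest
    flagged_for_teacher: bool = False

def next_step(plan: LearnerPlan, assessment_score: float) -> str:
    """Advance, remediate, or flag the learner based on the latest assessment."""
    if assessment_score >= MASTERY_THRESHOLD:
        return "advance"                        # mastery shown: move on to the next module
    if plan.remediation_queue:
        obj = plan.remediation_queue.pop(0)     # serve the next customized remediation object
        return f"remediate:{obj}"
    plan.flagged_for_teacher = True             # remediation exhausted without mastery
    return "flag_for_individual_attention"
```

The design point is the last branch: automation handles the routine advance-or-remediate decisions, and a human teacher is pulled in only when the automated remediation runs out.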

I was a Florida Orange Grove Scholar, a sort of evangelist for a statewide learning object repository.

I was the first on campus to use Elluminate, and later Skype, for advising as well as teaching.

The Next Exit History project (http://nextexithistory.com), which includes a free iOS and Android app you can download and use now, is based on the underlying architecture and design I have been developing for the past few years, called the TellusPoint Engine.

I have also been heavily involved in the design, development, and implementation of curriculum for the National Flight Academy and the related Aviation Classroom Experience (ACE) classroom projects. This involves a host of technologies assembled and aligned to create an immersive learning environment, enabling learners to develop STEM skills aligned with state and national standards in a highly engaging experience.

So, why me? I think it’s because I am starting to see patterns in how these things happen, rise, and fade away.

What’s next? I don’t know, but I can’t wait to find out, and as the Blue Angels always say, “I’m glad to be here.”

Where Do Emerging Technologies Fit Into Planning?

Painted into a corner

Perfect execution, bad planning.

Strategic planning, or perhaps more accurately the consequences of the lack of strategic planning, has been on my mind a lot recently. I hear people, some of whom I respect, say that the pace of change brought about by the emergence of so many disruptive technologies has effectively rendered strategic planning a futile effort for technology-based organizations. I think this attitude results from a fundamental confusion between strategic planning and operational planning.

Yes, emerging technologies are certainly disruptive to operational plans. If you were a large H.320 codec manufacturer with a heavy investment in hardware-based solutions, the advent of H.323 and software codecs certainly disrupted your operational plans, and if your strategic plan was to dominate the hardware codec market, you simply did not make it. However, if you were one of the very few whose strategic plan was to provide advanced communications to people and businesses at commodity prices, that strategic goal led you to avoid decisions that might bring your operational plans into conflict with it.

So is there risk? Certainly; you have to decide to do something, but you mitigate that risk by always checking to confirm that your actions best serve your strategy. This means a lot of change, and change is expensive. You mitigate that expense by making choices that provide the most flexibility.

So, to recap: strategic planning is an existential exercise. Good strategic planning involves having a clear vision of what you want to be, and good operational planning builds in the flexibility and financial resources to take advantage of the things that emerge to help you get there. Avoid misinterpreting operational plans as strategic plans. If you adequately state your strategic goals, the operational process for getting there lays itself out in front of you, as long as you are nimble enough to take the right steps when they come along.

The Moving Staircase at Hogwarts

Diffusion of Innovation – Can It Be Managed?

Everett Rogers (2003) tells us that there are five dimensions, or criteria, for predicting the successful diffusion or adoption of a technology.

  1. Relative Advantage
  2. Compatibility with existing values and practices
  3. Simplicity and Ease of Use
  4. Trialability
  5. Observable results

If we know these are the criteria, why can’t we control these factors to ensure that an innovation spreads? This isn’t a secret, and Rogers had been writing about it long before 2003.

The answer is probably implicit in the criteria themselves. If the innovation itself accommodates these criteria, it will probably catch on; if something is hard to learn and you can’t try it out without exposing yourself to the judgement of your peers, you most likely won’t adopt it. However, if you are in a position to manipulate those conditions, you may be able to create an environment where adoption is likely to occur even though the inherent properties of the innovation do not meet the criteria.

This has been a serious issue for many people who move into a technology leadership role from within the ranks. The culture of their sub-organization creates a very different environment than what is prevalent in the rest of the organization, and they fail to recognize what must be done to create the environment that will support their innovative projects.

How Can You Tell?

What is the difference between an emerging technology and an innovative application? When we used to talk about new technologies, we were talking about hardware: personal computers, side-scan radar, insulin pumps. What about now? Is ColdFusion, or ASP.NET, an application or a technology? What about AirPlay, or Google Drive? What about things that started out as hardware but are now software, like RAID controllers or video capture?

You might ask: “Does it matter?” I think it does. One way to deal with this is to reframe the question by asking what has to change, in terms of infrastructure and supporting resources, in order to implement something. This means it’s not a binary choice, but rather a continuum between two endpoints: the more change required, the more like a technology something is. What do you think?
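
If it helps to see that heuristic written down, here is a toy sketch. The categories and the scoring are entirely my own invention, just to show that the “technology-ness” of something can be treated as a position on a continuum rather than a yes/no label.

```python
# Hypothetical infrastructure categories that might have to change to implement something.
CATEGORIES = ["network", "servers", "client_hardware", "staff_skills", "support_processes"]

def technology_score(changes_required: set) -> float:
    """Return a 0.0-1.0 position on the application-to-technology continuum.

    0.0 = nothing has to change (feels like an application);
    1.0 = everything has to change (feels like a technology).
    """
    return len(changes_required & set(CATEGORIES)) / len(CATEGORIES)

# Example: something that only needs new staff skills scores low, while
# something demanding new networks, servers, and skills scores higher.
print(technology_score({"staff_skills"}))                        # 0.2
print(technology_score({"network", "servers", "staff_skills"}))  # 0.6
```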

Competency-based Assessment?

“What will happen when certifications and badges evolve into valid credentials? The DOE is pushing badges as a way to recognize education. Will the states support it? Can traditional Higher Education adjust? Will credentials be validated by communities of users?” – Gilfus Education Group

Are badges a legitimate replacement for traditional education credentials? What technologies make them possible and useful to learners?

What are the emergent, disruptive technologies that foster the development of communities of practice? What is the currency of validity for these communities, and do traditional academic institutions have any relevance within them? I think much of this concept is a direct descendant of the concept of Learning Object Repositories. Once knowledge could be defined in sufficiently discrete units, with attendant assessment, documentation, and tracking capability, all of the pieces were in place for the evolution of a culture that recognizes unique collections of skills assembled for specific purposes. These collections are dynamic and can be rapidly acquired and assembled in customized sequences to produce “just-in-time” learning for a specific purpose. Where does that leave the traditional university?
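
To give the idea some shape, here is one way such a discrete, trackable unit might be modeled. The fields are assumptions on my part, not any particular repository’s or badge platform’s schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class LearningObject:
    """A discrete unit of knowledge with its own assessment and tracking metadata."""
    object_id: str
    standard: str            # e.g., the state or national standard it is aligned to
    assessment: str          # reference to the assessment that documents mastery
    evidence_url: str = ""   # where the learner's documented work lives

@dataclass
class Badge:
    """A credential assembled from a customized sequence of learning objects."""
    name: str
    issued_by: str           # could be a community of practice, not a university
    objects: list = field(default_factory=list)
    awarded_on: Optional[date] = None

    def award(self) -> None:
        """Mark the badge as earned once the component objects are complete."""
        self.awarded_on = date.today()
```

Assembled “just in time”, a sequence of such objects becomes a credential whose validity rests on the issuing community rather than on a transcript, which is exactly the question posed above.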

Just a Helpful Hint

Many people do not realize that in a WordPress blog, you can comment on a post by clicking on the little caption balloon in the top right corner of the post.

There Are Patterns Everywhere

Street Musicians

The music is in the musician, not the instrument

Looking at this photo from a scene in one of my favorite European cities, Salzburg (and it’s not my favorite just because The Sound of Music was set there or because Mozart lived there, though those are pretty cool things to explore when you visit), I am struck by a truth that I discovered in one domain but came to realize applied to many others.

The arc of a particular technology is pretty much the same whatever domain it serves.

Some of the most talented musicians I have ever met use some of the most awful, beat-up, seriously out-of-whack equipment you can imagine, but when they play, they take advantage of the things that gear does well while minimizing the negative effects of the things it can’t do. As the caption of the photo says, “the music is in the musician, not in the instrument.” They make amazing music out of what they have.

Now, back to technology. When a new tool or technology comes out, the early adopters and the gadgeteers rush in to get the technology working for them, mostly because it’s new, but also because they see a glimmer of utility in it. Lots of other people who watch these early adopters join in, and pretty soon people are writing project justifications for “game-changing technologies.”

As with many aspiring musicians who believe that “if they only had that one new cool thing” they would sound so much better, people in the technology field are also always looking for that one new thing that will make their <fill your own pet project here> what they always dreamed it would be. Of course, both groups are disappointed.

Sometimes there are simply new technologies that really do constitute a significant gain in capability, but most of the real advancement comes from creative people exploiting the things they can scrounge together to make interesting new combinations of features or whole new capabilities greater than the sum of the pieces.

So, if you are confused about where emerging technologies come from and how they get there, look for the places where people are already doing creative things with what they have. If you are looking for that one new technology that will take your effectiveness in whatever you do to the gates of Nirvana, don’t stop looking, but don’t fail to see the things sitting right in front of you, just from a different perspective. – d3