Dictating With Dragon

How do you roll out training for a high-value yet optional transcription tool? Vanderbilt University Medical Center started by asking doctors how they prefer to learn.

Getting physicians to participate in training can be difficult, especially if it’s training for an optional tool not tied to regulations or required education.
“We can force them to take training if we have to, but it’s definitely not the most effective strategy,” said Dr. Paul Sternberg, chief medical officer for Vanderbilt University Medical Center in Nashville, Tennessee.
So when the learning department wanted to get doctors to switch from using a transcription service to voice recognition software to record patient notes in their electronic health records, or EHRs, it had to take a different approach.

VUMC is a network of award-winning teaching hospitals and clinics in the Nashville area, along with the schools of medicine and nursing associated with Vanderbilt University. The network has nearly 20,000 employees, including at least 3,000 physicians and other health care providers.
These health care providers have extremely busy schedules. A big portion of their days was spent on handwritten patient notes sent to a transcription service at roughly $8 per job, said Dr. Ian Jones, head of the emergency medical department. His department alone was spending about $500,000 per year on transcription services, and it represents just a fraction of the physicians and support staff who use the service.
In 2013, Jones’ group created a pilot for Dragon, speech recognition software that translates voice to text and offers custom versions for medical professionals dictating notes into EHRs. “It’s like Siri for health care,” said Brent Moseng, health operations system consultant for VUMC.
The software would allow them to dictate all of their notes and eliminate transcription costs. To teach the physicians how to use the software, the learning department arranged one-on-one training sessions with each doctor. Trainers often worked overnight shifts to accommodate the entire staff. “It took two months to train 80 staff,” Jones said.
But the effect was significant. Within months, the department cut its transcription costs by 90 percent. “We are just a small fraction of people using dictation at Vanderbilt,” he said. “If we put this software in the hands of everyone who takes patient notes, the savings would be monumental.”

How Do You Like to Learn?
The leadership at Vanderbilt agreed. Following the success of the Dragon pilot, the information technology department decided to roll out the software to the rest of the organization, and the learning and development group had to figure out the best way to get all 3,000 physicians and staff trained.
They quickly realized that a one-on-one training model would be too expensive and time-consuming — they estimated it would take 18 months and several trainers to get it done. It also wasn’t the most effective learning environment for most of the staff, said Anne Marie Danko, managing director of operational performance. “We know everyone learns differently, and each person knows how they learn best.”
To figure out the best learning approach, she created a short survey to find out how Vanderbilt physicians and staff preferred to learn. The survey asked questions like, “When you need an answer quickly, what is your preferred search method? Search the Internet? Read a manual? Email a colleague? Call a help desk?” Their responses would not only reveal their learning style but also identify how likely users would be to embrace the new technology.
Based on their answers, Danko sorted respondents into three categories: early adopters, majority users and late adopters. “Early adopters want the latest and greatest technology, and they are not afraid to figure things out on their own,” she said. Late adopters, on the other hand, are cautious and require more handholding. She found most respondents fell into the “majority user” category: open to learning on their own but wanting to know they can get help if they need it. Only 10 percent were considered late adopters, who would likely need instructor-led training.
With those results in hand, Danko’s team decided to build a do-it-yourself online learning course with optional drop-in clinics to provide additional live help if needed. Making the live training optional had a powerful psychological effect, she said. “If we told them they had to go to a clinic, they would have grumbled, even if they needed it. By letting them know it was just an option, we got no complaints.”
The DIY online course began with the basic learning modules provided by Dragon, then Danko’s team added short modules customizing the software for the Vanderbilt environment. They included quick start guides, videos of staff physicians using the tool with Vanderbilt EHR templates, screen shots demonstrating how to set up the system, and tips on how to adapt the software to specific dialects and medical specialties.
Once the modules were developed, the team rolled out a controlled training pilot to a volunteer group of early adopters. These volunteers completed the training and gave Danko’s group feedback on its effectiveness and potential areas for improvement. “We also encouraged this group to be champions for other users,” she said.
Superusers Drive Interest
Those champions were key to getting more cautious physicians to make the leap, said Dr. Shubhada Jagasia, vice chair of clinical affairs and one of the early adopters. She used Dragon for dictation prior to the rollout and was featured in some of the videos in the training module. She also became one of the program’s biggest champions. “My role is to let everyone know how I use the system, and how it has made my life easier. Once people see the time and efficiency of doing it this way, it wins them over.”
Having these superusers scattered throughout the population was key to getting people to try the software and take the training, said Sternberg, an executive sponsor for the initiative. “Dragon isn’t something they have to use, but it is something that we think will improve workflow and the quality of their work,” he said.
Because it wasn’t tied to a regulation, the training would only be effective if physicians wanted to take it. Early adopters created a buzz about the software’s benefits, and the training team built on that buzz by sending out emails, talking about the training in faculty meetings, and handing out headsets and information about the software to the staff. After that, getting buy-in was easy, Sternberg said, noting that even radiology physicians, whom they expected to be the slowest adopters, are now totally on board. “Once they took the training, it stuck like glue.”
More impressively, within six weeks all 3,000 physicians and staff had completed the training and were using the software. “The time savings alone was huge,” Danko said. During this six-week period, her team ran a schedule of drop-in support clinics at each facility where users could stop by to practice with the software and get help if desired. They also staffed a 24/7 help line for additional support.
Sternberg was one of the doctors who took advantage of this added face-to-face training. “I’ve been practicing medicine for a long time, and while I knew I could figure Dragon out, I wanted to be able to ask for help if I needed it,” he said. He used the online training to figure out the basics, like how to activate the software and turn on his mike, but when he had questions, like how to edit a mistake, he stopped by one of the clinics and spent an hour working with a coach. “It was really helpful to have someone walk me through it.”
The learning and development team doesn’t have hard numbers on how many people used the clinics, but estimates come in at roughly 10 percent of the user population. Compared with the original plan to offer live training to all 3,000 staff, training time and cost savings were substantial, Danko said. The robust software adoption cut transcription costs by $1 million in the first three months.
Just Ask
When it comes to training doctors, or any busy professionals, learning leaders first have to determine whether the training is a requirement or something that will add value. Sternberg said each requires a different psychology around implementation. If it’s training for something that adds value, like the Dragon software, communicate why the new tool or process will make users’ lives better and how the training will help them achieve their goals.
VUMC made an effort to understand learners’ attitudes about the training, asked them how they wanted to learn the material, and accommodated those needs in the instructional design. “If you give people information in the format that is easiest for them to use, they are more likely to use it,” Danko said. That may require creating multiple learning paths and incorporating a variety of learning styles into the content, but it’s worth it to ensure learners’ needs are met without wasting time and money on resources no one will use.
Taking the time to survey users and to build training to meet all of their needs was key to the success of this program, VUMC’s Moseng said. “We listened to what people wanted, and it really paid off.”