Identifying the State of the Art in E-Learning with the Innovation, Instruction, and Implementation in Federal E-Learning Science & Technology Conference
Scotty D. Craig
Human Systems Engineering, Arizona State University
Training and educational organizations are rapidly changing to support their stakeholders within the e-learning setting. Leaders within these organizations must actively work to stay up to date on best practices within the field. The Innovation, Instruction, and Implementation in Federal E-Learning Science & Technology (iFEST) Conference is the premier conference on distributed learning, bringing together thought leaders, innovators, and senior officials from government, industry, and academia to collaborate and share the latest challenges and innovations in the field. The conference offers innovative keynote talks, panel sessions, interactive activities, industry exhibits, and talks from individual presenters. Topic areas include digital learning science, learning technology, learning data, technology interoperability, policy, and a timely topic that changes each year.
Training and educational organizations are rapidly changing how they support their stakeholders. These changes are driven by technological innovations and the need to provide education and training to larger numbers of learners at a rapid pace (Graesser et al., 2019). Many of these learners are immersed in online learning environments. Based on 2019 data, the most recent available at time of print, 7,313,623 students were enrolled in online education courses at the postsecondary level in the United States, or 44.2% of the student population (National Center for Education Statistics, 2020, 2021). Even traditional classrooms are changing through increased use of technology to offload direct instruction, thus allowing instructors to facilitate higher level learning (e.g., flipped classrooms and technology-enhanced classrooms; Enfield, 2013; Roehl et al., 2013).
The high pressure of providing education and training within this rapidly growing technological environment often requires rapid decisions based on limited information. Unfortunately, such demands can result in well-meaning decision-makers pursuing suboptimal or misleading choices. Decision-makers often cling to traditional methods (e.g., in-person lectures) instead of innovating (Allen & Seaman, 2013), in part due to beliefs that technology-supported techniques are less effective. This belief is not supported by the evidence. E-learning (Means et al., 2013) and blended/flipped/technology-enhanced classrooms (Liu et al., 2016) can be just as effective as traditional classrooms, and in some cases, more effective. However, to be successful, there must be a deliberate consideration of the needs of learners and the organization, support for those needs, and willingness to explore state-of-the-art techniques for addressing those needs (Craig & Schroeder, 2020; Craig, Schroeder et al., 2020). Leaders of training and educational organizations, as well as other members of the organization, must stay up to date with best practices within the science of learning and current trends in learning technology, and must learn from effective policies on learning implemented by other organizations (Craig & Schroeder, 2020).
Facilitating Knowledge of Best Practices for E-Learning at iFEST
The Innovation, Instruction, and Implementation in Federal E-Learning Science & Technology (iFEST) Conference is an ideal venue to help training and education organizations stay up to date on state-of-the-art learning practices and procedures related to learning with technology. The conference is “the premier conference on distributed learning, bringing together thought leaders, innovators, and senior officials from government, industry, and academia to collaborate and share the latest challenges and innovations in the field” (Advanced Distributed Learning Initiative, 2021). First held in 2003, iFEST just finished its 18th successful annual conference. The conference is jointly organized by the National Training and Simulation Association and the Advanced Distributed Learning Initiative. The call for ideas for submissions to iFEST runs from around 15 January to 15 March. The conference is normally held around the end of August or early September.
The 2021 meeting of iFEST was held from 31 August to 2 September 2021 (see Figure 1). The conference was held online; however, most iFEST conferences have been in person in Washington, D.C. The conference had 525 attendees spanning the public, private, nonprofit, and academic sectors. The bulk of the attendees came from federal government or military backgrounds and received free attendance to the conference.
The conference offers information in many different formats. It offered three real-time keynote talks: a government keynote from Dustin Brown, Senior Executive Service, deputy assistant director for management, Office of Management and Budget; a legislative keynote from Rep. Robert C. Scott (D-VA), chair of the U.S. House Committee on Education and Labor; and a military keynote from Maj. Gen. Donn H. Hill, Combined Arms Center deputy commanding general–education, Army University provost, and Command and General Staff College deputy commandant. In addition, the conference had panels on modernizing training for integrated operations and innovation in government learning systems. Both included candid discussions among experts from the U.S. government and around the world. The live elements of iFEST also included eight activity sessions where attendees could engage in hands-on activities and training and an exhibit hall where attendees could interact with cutting-edge companies and organizations in the field. In addition to these, the conference had 27 prerecorded presentations and 13 prerecorded posters.
Each year, iFEST focuses on five common lines of effort plus an annual timely topic. For the 2021 conference, the topics and their descriptions are provided in the Table.
Digital Learning Science
Technology is now a more important component within the learning process than ever before. However, the fundamental principles of how humans learn have not changed. For humans, learning is messy. The act does not take place in a sterile environment, nor is it automatic. Learning is individualistic, sometimes spontaneous, but often very effortful, slow, and gradual, and moves forward in fits and starts (Hattie, 2009). Because of this, training and educational organizations must support the needs of the stakeholders, ensure that appropriate resources are allocated, and ensure buy-in from all stakeholders (Craig & Schroeder, 2020; Giattino & Stafford, 2019; Moore & Kearsley, 2011; Muilenburg & Berge, 2001). Thus, it is important for educational decision-makers, instructional designers, and instructors to understand the best practices for learning and implement them to the best of their ability and resources. In the remainder of this section, we have summarized the basics of human learning that could be supported by well-organized, state-of-the-art e-learning.
Digital Learning Science at iFEST. This year’s iFEST conference had three talks pertaining to the digital learning science area. These ranged from very specific topics, such as the training effectiveness of augmented and virtual reality and the role of instructors in personalized learning, to broader methods for heuristically evaluating learning organizations’ compliance with science of learning best practices.
Learning Technology
Learning technology encompasses a large swath of space, from basic websites or PDF-based e-books to highly interactive learning systems that use artificial intelligence to personalize the experience for learners. These technologies range in effectiveness. Noninteractive technologies such as e-books that are little more than PDF page-turners are not particularly effective or well liked (Daniel & Woody, 2013). Computer-aided instruction, such as an e-book expanded with video-based modeling and predictive questions, is considered more effective (Craig et al., 2018). A dynamic personalized system such as an intelligent tutoring system is the most effective (Kulik & Fletcher, 2016; Ma et al., 2014). One review of the literature even showed that dynamic systems such as intelligent tutoring systems are as effective as one-on-one human tutoring (VanLehn, 2011).
Learning Technology at iFEST. With 26 presentations, the learning technology area was the largest and most popular topic at iFEST, a pattern that has repeated over the last few years. Most of these talks presented a specific system, with an overview of the learning science principles that support it, evaluations of the system, and information on how the system has been applied in the field.
Learning Data
To modernize courses and enable information sharing, learning technologies must be able to collect and output learning data. There are several popular standards for data. xAPI is an example of one method for capturing, standardizing, and sharing human performance data. Within xAPI, all learning experiences can be represented as interactions both internal and external to the online environment (Murphy et al., 2016). These data can be stored within databases for later analysis via learning analytics and data mining techniques. The output of these analyses can then be used to optimize future learning through increased personalization (e.g., of learning materials or processes) or data visualizations (e.g., dashboards that offer feedback or recommendations to students, instructors, or administrators). Additionally, these data can be used to detect unproductive learning behaviors (Papamitsiou & Economides, 2014) and even cheating behaviors (Chuang et al., 2017). Long and colleagues (2015) implemented personalization and visualization strategies within a rifle marksmanship course, resulting in a nearly 40% reduction in training time. Although this approach is promising, additional research is needed to determine the best practices for implementation and impact.
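As a concrete illustration, an xAPI statement records a learning experience as an actor-verb-object structure, optionally with a result. The sketch below builds a minimal statement in Python; the learner name, mailbox, activity ID, and score are hypothetical placeholders, not values from any real system.

```python
import json

# A minimal xAPI statement: who (actor) did what (verb) to what (object),
# with an optional result. All identifiers below are illustrative
# placeholders, not real learners or endpoints.
statement = {
    "actor": {
        "name": "Jane Learner",
        "mbox": "mailto:jane.learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/rifle-marksmanship/lesson-3",
        "definition": {"name": {"en-US": "Lesson 3: Sight Alignment"}},
    },
    "result": {
        "score": {"scaled": 0.85},  # normalized score in [0, 1]
        "success": True,
        "completion": True,
    },
}

# Serialize to JSON, the wire format a learning record store accepts.
payload = json.dumps(statement)
print(payload)
```

Because every system emits the same actor-verb-object shape, statements from a simulator, an e-book, and a classroom exercise can all be stored and analyzed together.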
Learning Data at iFEST. The uses of learning data are broad, and this is reflected in the 10 presentations at this year’s iFEST. The topics ranged from specific how-to applications, such as using captured data to identify effective digital instruction and to provide effective data visualizations in the form of dashboards, up to reviews of the return on investment (ROI) from using learner data and of using learner data to ensure credential requirements are met for certification programs across multiple institutions.
Technology Interoperability
Technology should collect and support data within courses and ideally feed into databases that can be reused within the course, externally from the course for course redesign (Paredes et al., 2020), or to feed into a larger learning ecosystem (Gordon et al., 2020; see Figure 2). Such sharing requires that learning technologies be interoperable, with xAPI providing a popular standard for exchanging human performance data between systems.
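In practice, interoperability means each system can send its xAPI statements to a shared learning record store (LRS) over the xAPI REST interface: an HTTP POST to the LRS's statements resource with a version header and credentials. The sketch below builds such a request with Python's standard library; the endpoint URL and authorization token are placeholders, and the request is constructed but deliberately not sent.

```python
import json
import urllib.request

# Hypothetical LRS endpoint and credentials -- placeholders only.
LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"
AUTH_TOKEN = "Basic PLACEHOLDER"

# A minimal statement to transmit (identifiers are illustrative).
statement = {
    "actor": {"mbox": "mailto:jane.learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "http://example.com/courses/lesson-3"},
}

# Build the POST request per the xAPI specification: a JSON body,
# the X-Experience-API-Version header, and an Authorization header.
request = urllib.request.Request(
    LRS_ENDPOINT,
    data=json.dumps(statement).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "X-Experience-API-Version": "1.0.3",
        "Authorization": AUTH_TOKEN,
    },
    method="POST",
)

# urllib.request.urlopen(request) would transmit the statement; it is
# omitted here because the endpoint above is a placeholder.
print(request.get_method(), request.get_full_url())
```

Any simulator or courseware that can issue this kind of request can contribute to, and draw on, the same pool of learner data.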
Growing evidence supports the use of technology interoperability. Long et al. (2015) investigated interoperable system performance for unstabilized gunnery simulators. The goal was to improve the efficiency of the adaptive training curriculum on a virtual simulation training system. They found a significant reduction in the amount of time to train, with comparable final qualification scores. The Army Research Laboratory developed Pipeline, a Microsoft .NET dynamic link library that simulator vendors can wrap around their systems to generate and consume xAPI activity statements (Long et al., 2015). Similar to the result found by Murphy et al. (2016), a nearly 40% reduction in time spent training on Basic Rifle Marksmanship was found, mainly due to acceleration in the curriculum. However, in this study the participants were cadets from a local ROTC program rather than actual military trainees. Furthermore, both studies addressed only a stove-piped learning episode rather than multiple learning episodes, as both implemented adaptation in a single learning experience (Smith et al., 2018). Smith et al. (2018) stated that, ideally, these adaptations should be applied within and across learning and development episodes.
Technology Interoperability at iFEST. The conference had nine presentations on technology interoperability. These included talks on transitioning to higher levels of interoperability, such as moving from older SCORM systems to modern xAPI systems and retrofitting standard classroom training into more technology-supported and interoperable environments. Other presentations explained higher level standards that have been set forth for implementing interoperable networks, such as the Total Learning Architecture.
Policy
Any learning organization is only as good as the governance set forth to oversee the operation of its learning ecosystem (Walcutt & Schatz, 2019). Policy is one of the key issues that must be settled to guide the process of good governance (Giattino & Stafford, 2019). These policies establish who among multiple constituencies is responsible for establishing and enforcing policy across the organization. The policies also guide change within the learning ecosystem by setting acceptable guidelines for evaluating performance (Berk, 2013; Giattino & Stafford, 2019; Hai-Jew, 2006) and providing flexibility that allows change within the organization without fear of reprisal (Craig, Li et al., 2020).
Policy at iFEST. The iFEST conference had four excellent presentations on policy this year. Two of the talks presented policies on xAPI implementation, at a higher level by the Advanced Distributed Learning Initiative and at a more applied level within the U.S. Navy. The other two talks were excellent examples of public transparency of policy, with a public review and comment session on the NATO Advanced Distributed Learning handbook and a consideration of stakeholders in a talk that discussed integrating learning engineering into a team.
Annual Timely Topic
The annual topic this year was “Living and Thriving in the New Normal.” This topic was in direct response to the drastic shift toward e-learning during the COVID-19 pandemic, a widely documented phenomenon that has impacted most, if not all, instances of training and education (Soni, 2020). For example, a quick search on Google Scholar with the key terms COVID-19 and e-learning shift returned 2,920 articles reporting how the shift occurred, ranging from K-12 to adult learning organizations, across almost every discipline of learning and numerous countries.
Annual Timely Topic (Living and Thriving in the New Normal) at iFEST. The annual timely topic did not disappoint. Nine interesting talks provided guidance and lessons learned for the breakneck speed at which most learning and training organizations now work. These talks ranged from practical guidance on best practices for recording success and developing creativity in using new technology to specific guidance on useful technologies, such as how to switch between in-person, online, and blended learning and how to extend reality with new technologies such as augmented reality and virtual reality.
In the words of Abraham Lincoln (1989), “we know nothing of what will happen in future, but by the analogy of experience” (p. 50). However, dealing with the COVID pandemic has taught us that past experience does not always provide the best analogy. That is why leaders within training and learning organizations must be prepared to understand the state of the art in modern learning ecosystems (Craig, Li et al., 2020; Walcutt & Schatz, 2019). Having a firm foundation in modern learning ecosystems is essential for creating innovative learning organizations that can quickly respond to new challenges. Attending and presenting at conferences such as iFEST is a unique opportunity to understand the cutting edge of modern learning ecosystems and to identify the people who are moving the area forward.
Advanced Distributed Learning Initiative. (2021). iFEST 2021 call for ideas. https://adlnet.gov/news/2021/01/15/iFEST-2021-Announcements/
Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group and Quahog Research Group LLC.
Berk, R. A. (2013). Face-to-face versus online course evaluations: A “consumer’s guide” to seven strategies. MERLOT Journal of Online Learning and Teaching, 9(1), 140–148.
Chuang, C. Y., Craig, S. D., & Femiani, J. (2017). Detecting probable cheating during online assessments based on time delay and head pose. Higher Education Research & Development, 36(6), 1123–1137. https://doi.org/10.1080/07294360.2017.1303456
Craig, S. D., Li, S., Prewitt, D., Morgan, L. A., & Schroeder, N. L. (2020). Science of learning and readiness (SoLaR) exemplar report: A path toward learning at scale. Arizona State University. https://apps.dtic.mil/sti/pdfs/AD1104999.pdf
Craig, S. D., & Schroeder, N. L. (2020). Science of learning and readiness (SoLaR) recommendation report: Science of learning practices for distributed online environments. Arizona State University. https://apps.dtic.mil/sti/pdfs/AD1105006.pdf
Craig, S. D., Schroeder, N. L., Roscoe, R. D., Cooke, N. J., Prewitt, D., Li, S., Morgan, L. A., Paredes, Y. V., Siegle, R. F., & Clark, A. (2020). Science of learning and readiness state of the art report. Arizona State University. https://apps.dtic.mil/sti/pdfs/AD1106590.pdf
Craig, S. D., Zhang, S., & Prewitt, D. (2018, September). Deep reasoning for enhancing etextbooks (DREE): Using deep-level questions for guiding learning. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 62(1), 341–345. https://doi.org/10.1177%2F1541931218621079
Daniel, D. B., & Woody, W. D. (2013). E-textbooks at what cost? Performance and use of electronic v. print texts. Computers & Education, 62, 18–23. https://doi.org/10.1016/j.compedu.2012.10.016
Enfield, J. (2013). Looking at the impact of the flipped classroom model of instruction on undergraduate multimedia students at CSUN. TechTrends, 57(6), 14–27. https://doi.org/10.1007/s11528-013-0698-1
Giattino, T., & Stafford, M. (2019). Governance for learning ecosystems. In J. J. Walcutt & S. Schatz (Eds.), Modernizing learning: Building the future learning ecosystem (pp. 317–338). U.S. Government Publishing Office. https://adlnet.gov/publications/2019/04/modernizing-learning/
Gordon, J., Hayden, T., Johnson, A., & Smith, B. (2020). Total learning architecture 2019 report. Advanced Distributed Learning Initiative. https://adlnet.gov/publications/2020/04/2019-Total-Learning-Architecture-Report/
Graesser, A., Hu, X., & Ritter, S. (2019). History of distributed learning. In J. J. Walcutt & S. Schatz (Eds.), Modernizing learning: Building the future learning ecosystem (pp. 17–42). U.S. Government Publishing Office. https://adlnet.gov/publications/2019/04/modernizing-learning/
Hai-Jew, S. (2006). Operationalizing trust: Building the online trust student survey (OTSS). Journal of Interactive Instruction Development, 19(2), 16–30.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Routledge.
Kulik, J. A., & Fletcher, J. D. (2016). Effectiveness of intelligent tutoring systems: A meta-analytic review. Review of Educational Research, 86(1), 42–78. https://doi.org/10.3102%2F0034654315581420
Lincoln, A. (1989). Speeches and writings, 1832-1858: Speeches, letters, and miscellaneous writings: The Lincoln-Douglas debates (Vol. 45). Library of America.
Liu, Q., Peng, W., Zhang, F., Hu, R., Li, Y., & Yan, W. (2016). The effectiveness of blended learning in health professions: Systematic review and meta-analysis. Journal of Medical Internet Research, 18(1), 1–19. https://doi.org/10.2196/jmir.4807
Long, R., Hruska, M., Medford, A. L., Murphy, J. S., Newton, C., Kilcullen, T., & Harvey, R. L. (2015). Adapting gunnery training using the experience API. In Proceedings of the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) (pp. 1–12). National Training & Simulation Association.
Ma, W., Adesope, O. O., Nesbit, J. C., & Liu, Q. (2014). Intelligent tutoring systems and learning outcomes: A meta-analysis. Journal of Educational Psychology, 106(4), 901–918. https://doi.org/10.1037/a0037123
Means, B., Toyama, Y., Murphy, R., & Baki, M. (2013). The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115, 1–47.
Moore, M. G., & Kearsley, G. (2011). Distance education: A systems view of online learning (3rd ed.). Wadsworth Cengage Learning.
Muilenburg, L. Y., & Berge, Z. L. (2001). Barriers to distance education: A factor-analytic study. The American Journal of Distance Education, 15(2), 7–22. https://doi.org/10.1080/08923640109527081
Murphy, J., Hannigan, F., Hruska, M., Medford, A., & Diaz, G. (2016). Leveraging interoperable data to improve training effectiveness using the Experience API (xAPI). In D. D. Schmorrow & C. M. Fidopiastis (Eds.), Proceedings, Part II, of the 10th International Conference on Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience (Vol. 9744, pp. 46–54). Springer-Verlag. https://doi.org/10.1007/978-3-319-39952-2_5
National Center for Education Statistics. (2020). Fast facts. https://nces.ed.gov/fastfacts/display.asp?id=80
National Center for Education Statistics. (2021). Table 311.15: Number and percentage of students enrolled in degree-granting postsecondary institutions, by distance education participation, location of student, level of enrollment, and control and level of institution: Fall 2018 and fall 2019. https://nces.ed.gov/programs/digest/d20/tables/dt20_311.15.asp
Papamitsiou, Z., & Economides, A. A. (2014). Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. Journal of Educational Technology & Society, 17(4), 49–64.
Paredes, Y. V., Siegle, R. F., Hsiao, I. H., & Craig, S. D. (2020, December). Educational data mining and learning analytics for improving online learning environments. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 64(1), 500–504. https://doi.org/10.1177%2F1071181320641113
Roehl, A., Reddy, S. L., & Shannon, G. J. (2013). The flipped classroom: An opportunity to engage millennial students through active learning strategies. Journal of Family & Consumer Sciences, 105(2), 44–49.
Smith, B., Gallagher, P. S., Schatz, S., & Vogel-Walcutt, J. (2018). Total learning architecture: Moving into the future. Proceedings of the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) (pp. 1–11). National Training & Simulation Association.
Soni, V. D. (2020). Global impact of e-learning during COVID 19. Campbellsville University. https://dx.doi.org/10.2139/ssrn.3630073
VanLehn, K. (2011). The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educational Psychologist, 46(4), 197–221.
Walcutt, J. J., & Schatz, S. (2019). Modernizing learning. In J. J. Walcutt & S. Schatz (Eds.), Modernizing learning: Building the future learning ecosystem (pp. 3–16). U.S. Government Publishing Office.
Scotty D. Craig is an associate professor of human systems engineering at Arizona State University and director of the Arizona State University Advanced Distributed Learning Partnership Laboratory. He has a dual affiliation with the Ira A. Fulton Schools of Engineering and the Mary Lou Fulton Teachers College. He obtained his PhD in cognitive psychology with a focus on learning from The University of Memphis. Craig has expertise within cognitive psychology, usability, and the science of learning with a focus on development and evaluation of learning technology.