Contextualizing learning through the Ecology of Resources (EoR) framework

Ecology of Resources Design Framework (EoR)

The EoR is a framework for investigating the resources a learner currently has access to, and the further resources that could help build their understanding. Applying the EoR model requires exploring the learner’s context in various ways (interviews, ethnographic study, observation) to reveal their available internal and external resources, such as knowledge, environment, tools and people, and the relationships between them.

The next step is determining the other resources and more able partners (MAPs) that would help move the learner from their present state of understanding to a more developed one. The space between the two states is described as a set of zones: the zone of proximal development (Vygotsky), the zone of proximal adjustment, and the zone of available assistance. A diagram of these zones can be found at http://eorframework.pbworks.com/. Assistance should be placed so that supports arrive just in time and then fade, scaffolding the learning.

Comments

This is a complex framework and model for exploring the learner’s context in order to design useful, supportive and customized/personalized learning experiences. It also attends to the relationships between resources, such as the use of technology and the apps or WiFi connection available, and other affordances.

The framework also identifies both positive and negative filters in a learner’s context, which are taken into consideration when developing learning supports. Such a filter could be a mentor, an organizational structure, a workshop, rules, and so on.

While designing learning with such a framework could be overwhelming and complex, I believe it offers a deeper look at the many influences and resources surrounding a learner. I like the concept of considering the relationships between resources – this piques my interest, and has me wondering how multi-layered and dynamic situations, tools and influences affect learning in and for real-life scenarios.

A good case study about exploring EoR for language learning can be found at: http://www.slideshare.net/joshuau/context-connections-designing-a-vocab-app

Learning analytics should include measuring change

I have been mulling over the use of learning analytics as a research method to determine if and how students are learning when engaged with online learning tools. At the latest Learning Analytics and Knowledge conference in Banff, Alberta, Canada, it struck me that one of the most troubling questions for presenters was what to measure when applying learning analytics. I agreed when some presenters stated that merely measuring the number of log-ins, hits or postings does not provide an accurate indication of whether students are learning. For instance, in online learning environments (formal and informal) I don’t log on that much. When I do, I download or link readings and materials and learn more on my own than with others. Once I feel I have a grasp of the content or ideas, I log back on and engage somewhat near the end of the allotted timeframe. I do learn in social and collaborative settings, but not as much as those promoting the idea would suggest.

The point is I don’t believe we can measure learning based on online activity alone – we need to include and bridge it with assessment. More importantly, learning is generally recognized by a change in behaviour, thinking and attitude; in short, we change and grow through learning, and this needs to be captured, measured and examined to determine the success of students.

Capturing data to measure change in learners poses some problems. Ideally, pre- and post-tests would provide one way to measure change and growth, but it’s not possible to structure all learning activities or assessments that way. Other possible ways to measure learning are the level of achievement on various assessments, and feedback from students on their perceptions of their learning (via surveys, etc.). However, perceptions and knowledge acquisition are not synonymous. Determining evidence of change in learners for use in learning analytics will be challenging but necessary in order to gather essential data.
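
Where a pre-/post-test design is possible, the comparison itself is straightforward. As a minimal sketch (the scores below are invented, and a real study would need far more care with sampling and instruments), a paired comparison in Python could look like this:

```python
# Hypothetical pre-/post-test scores for the same learners (illustration only).
from scipy import stats

pre_scores  = [62, 55, 70, 48, 66, 59, 73, 51]
post_scores = [71, 60, 74, 63, 70, 58, 80, 64]

# Average gain per learner.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)

# Paired t-test: did scores change more than chance alone would suggest?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

print(f"Mean gain: {mean_gain:.1f} points")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```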

Aside from evidence of change and growth, online learner interaction and engagement would provide data on the actions that perhaps contributed to their learning. I would suggest moving beyond the tracking data provided in an LMS to also include interactions in external learning environments and tools, such as blogs, wikis or social networking sites. I envision this data including frequency statistics (hits, posts) and network visualizations (connections).
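
As a rough sketch of what such network views might start from (the learners and reply counts here are made up), one could build a small interaction graph with networkx:

```python
# Turning hypothetical interaction logs into a simple network view.
import networkx as nx

# (who_posted, who_they_replied_to, number_of_replies)
interactions = [
    ("Ana", "Ben", 4), ("Ben", "Ana", 2), ("Cara", "Ana", 5),
    ("Dev", "Cara", 1), ("Ben", "Cara", 3), ("Eli", "Ben", 2),
]

graph = nx.DiGraph()
for source, target, weight in interactions:
    graph.add_edge(source, target, weight=weight)

# Simple frequency statistics: how active and how connected is each learner?
replies_sent = dict(graph.out_degree(weight="weight"))
replies_received = dict(graph.in_degree(weight="weight"))

for learner in graph.nodes:
    print(f"{learner}: sent {replies_sent.get(learner, 0)}, "
          f"received {replies_received.get(learner, 0)} replies")

# Centrality gives a first hint at who sits at the core of the conversation.
print(nx.degree_centrality(graph))
```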

Lastly, it would be important to determine the resources learners are accessing. Higher education has come to realize that instructors are not the sole holders of knowledge, and the net provides endless sources of materials, resources and expertise. This might be more difficult to determine, but reviewing the works cited in students’ work might reveal the resources they used beyond those offered in the class.

Triangulating this data will be essential to understand how students engaged online and how that contributed to their learning and growth. Examining the correlations in this data would indicate which online activities and resources affected learning and which did not.
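
As a minimal illustration of that kind of correlation check (all numbers are invented; a real analysis would draw on the triangulated data described above):

```python
# Correlating invented online-activity measures with final grades.
import pandas as pd

data = pd.DataFrame({
    "logins":         [12, 30, 8, 22, 15, 40, 5, 27],
    "posts":          [3, 14, 1, 9, 6, 18, 0, 11],
    "resources_used": [7, 20, 4, 15, 9, 25, 3, 17],
    "final_grade":    [68, 85, 60, 79, 72, 90, 55, 83],
})

# Pearson correlations between each activity measure and the final grade.
correlations = data.corr()["final_grade"].drop("final_grade")
print(correlations.sort_values(ascending=False))
```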

This is a rudimentary understanding of the use of learning analytics in education. I am about to engage in a research study where I will attempt to apply this analysis method. Hopefully over time I can develop my thinking and share new ideas about this topic along with my experiences from using learning analytics as an educational research method.

Emerging research method: Learning Analytics

Recently I attended the First International Conference on Learning Analytics and Knowledge 2011 (LAK11) in beautiful Banff, Alberta, hosted by TEKRI, a division of Athabasca University. I was inspired by the people I met and learned from, including George Siemens, Dave Cormier, Terry Anderson, Rory McGreal, Grainne Conole, and David Wiley. Additionally, I met a number of statisticians, data analysts, programmers, educators, and innovators.

Below are my collected notes on the subjects discussed at the conference about this emerging research and analysis method, learning analytics. Please note this is an emerging concept, and the notes reflect the array of perceptions and ideas of a wide range of people; they reflect the newness of, and potential for, this process. (Please correct me if I misunderstood any notions.)

In subsequent posts I will provide my views on the method and its potential, challenges, and discrepancies, in hopes of contributing to a scholarly discussion.

Notes:

  • Overview
    • Learning analytics is analysing triangulated data (mashed-up data) about learners and their context, learning, and environment. It’s strategic analytics for administration, academics and context.
    • It is about attempting to map student comprehension within a specific domain. It draws on pragmatic analysis from the marketing field and the analysis of consumer behaviour. The steps for learning analytics are collecting, reformatting or coding, cleansing, and analysing (visually as well) the data; a rough sketch of these steps appears after these notes. It is a form of data mining for information representative of learners, analysed in automated ways.
    • The semantic web is a way to connect personal information about someone and determine patterns of behaving, knowing, and learning. However, this requires bringing meaning to written words and creating ontologies to connect them. It’s a daunting task with the challenge of misinterpreted and subjective meaning. Ontologies could include those for content structure and type, user models, learning design, and domains.
    • Learning Analytics Goals
      • Mapping learner interactions and actions using multiple ontologies and theories
      • Testing learning theories and approaches
      • Choosing analysis types
        • Best to do predictive and optimization analysis versus descriptive and forecasting (Davenport & Harris)
      • Monitoring learning stages
        • Tracking student records and successes from high school to postsecondary
      • Determining learner support needed
        • Determine interventions to help students, recommenders, intelligent tutors
      • Revealing student patterns
        • Student course and resource choices (web tracking)
        • Association between media and roles
        • Student attention
        • PLE usage
        • Impact of group and collaborative work; communal view
      • Providing gap analysis:
        • Student progress compared to learning outcomes ontologies
      • Reviewing learning outcomes
        • Analysis of 21st century skills
        • Richness of dialogue (Mercer)
        • Student-rated discussion posts
        • Correlate online activity with grades
        • Correlate resources use with grades
      • Socio-technical affordances: technology, self and others
      • Effects of networks on workload, role, and learning
      • New directions:
        • New light or perspective on a phenomenon
        • New or altered predictive or causal models
        • New views on learning
    • Learning predictors:
      • 21st century skills
      • granularity of learning
      • comprehension
      • student behaviour
      • expert vs novice behaviour
      • learning outcomes
      • interaction with others
      • participation patterns (SNA)
      • social isolation
      • use of learning tools/widgets
      • time on task
      • resource use
      • assessments of others on student work; community opinion
      • quality of discourse
      • effects of keynote speaker on discourse
      • effects of media on interaction
    • Learning environments analyzed:
      • Social networking sites
      • PLEs
      • LMS
      • Web2.0 tools
      • Laptops
    • Data sets:
      • hash tag used and connections (topic of interest and connections)
      • delicious tag used and connections (need semantic meanings)
      • twitter count of a url
      • student records/successes from high school to postsecondary
      • connection of student learning to other data (vendors, etc.)
      • discourse analysis using semantics, connections, and types of discourse (Mercer)
      • patterns of actions and behaviours to implement interventions
      • descriptive data: log in count and times, resources chosen
      • triangulation of website, course, library and online activity
      • social connections of learners
      • nested model: cluster behavioural data with social practices; combine quantitative data
      • merge interests, social connections, and resource use (how people learn)
      • dimensions involved in learning object use
      • contextualized action model: acts dependent upon another; semantic connection
      • merging concept analysis with social analysis
      • merging content analysis with topic modelling
      • institutional data for cross sectional and cross tabulation analysis
      • group analytics
      • automated discourse analysis

      • Data analysis tools and methods

        • ROLE (Responsive Open Learning Environments): tracking of widget use
        • Recommenders: based on resources used, learning actions, etc.
        • Contextualized Attention Metadata (CAM): track online movements
        • Google Analytics: website traffic
        • Google Social: public connections of web users
        • Screen scraping: reformatting data published on the web in HTML format
        • Any data sets in usable formats: CSV or Excel data
        • Yahoo Pipes: filter web content into site and analyze; data needs reformatting
        • Many Eyes: IBM visualisation tools; existing data sets and visualisations available
        • Gephi: network mapping tool (?)
        • Google Refine: cleanses data
        • Freebase: data sets to mash-up; restructure to another model
        • Stanford Data Wrangler: restructures any unstructured data
        • bit.ly: analysis of connected web links
        • Python: API for Google Analytics to parse reports
        • Various dashboards of visualisations
        • Selection of shared metrics and algorithms
        • Gephi: links among networks
        • Google Trends: traffic insight on favourite websites
        • SNAPP: social network visualisation
        • Lecture capture: analysis of viewer actions
        • Latent semantic analysis (LSA)/discourse analysis and interaction analysis
        • Social network analysis (SNA): visualization of social relationships in a network; analysis of nodes and ties
        • Game theory: capture and predict behaviours in strategic situations; dependency on others
        • Virtual machines tracking student actions; full computer systems
        • Wakoopa: track website visits
        • RescueTime: tracks time spent on tasks
        • LMS analysis: student activity and grades tracked in learning management systems
        • AAT (Athabasca U): new software, academic analysis tool for Moodle and any LMS; student engagement, tool use and outcomes
      • Questions about Learning Analytics
        • Who accesses and can access data?
        • Who needs the analysis?
        • Who knows about learning analytics?
        • What are the theories or models underlying analysis query?
        • How will others react to the analysis?
        • What new lens/perspective can be applied?
        • What minimal data is needed?
        • How to use the results?
        • What interventions would be best?
        • What is learning, and how do we measure it?
      • Challenges
        • Ability of data to show learning
        • Defining learning and learning ties
        • Defining dropout for online learners (these three are my interests)
        • Gathering and analysing data from multiple platforms
        • Developing infrastructures to collect, analyse, interpret and report data
          • Placement and access of infrastructure
          • Data management and technology
          • Student and faculty centred tools; useable interfaces
        • Valid and reliable data
          • Fragmented data on online learners
          • Quality of data
          • Working with unstructured data
          • Better data sets
          • Operational definitions of constructs
            • Expert opinion
            • Theory based
            • Data driven
          • Defining context in word meaning
          • Analyze causal rather than predictive data
          • Invisible network of participation
        • Security and privacy issues; being transparent
        • Dynamic vs. static analysis models; predictors constantly changing
        • Cultural considerations/differences
        • Automating discourse analysis
        • Merging big data (quantitative) and qualitative data: triangulation
      • Data Science Team Members
        • Stakeholders/users
        • Data scientists
        • Programmers
        • Statisticians
        • Visualiser
        • Learning scientists/instructional designers
        • Data evaluators
        • Information technologists
        • Interpreters
        • Project manager
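
To make the collecting, reformatting, cleansing and analysing steps mentioned in the notes above a little more concrete, here is a minimal sketch in Python using pandas; the log records, column names and event labels are all invented for illustration:

```python
# Minimal sketch of the collect -> reformat -> cleanse -> analyse steps.
import pandas as pd

# 1. Collect: in practice, exported from an LMS or web analytics tool.
raw_log = pd.DataFrame({
    "student": ["s01", "s01", "s02", "s02", "s02", "s03", "s03", "s03"],
    "event":   ["login", "view_resource", "login", "post", "post",
                "login", "view_resource", "view_resource"],
    "timestamp": ["2011-03-01 09:10", "2011-03-01 09:15", "2011-03-01 10:02",
                  "2011-03-01 10:20", "2011-03-01 10:20",  # duplicate record
                  "2011-03-02 14:05", "2011-03-02 14:30", "2011-03-02 15:00"],
})

# 2. Reformat / code: parse timestamps into a usable type.
raw_log["timestamp"] = pd.to_datetime(raw_log["timestamp"])

# 3. Cleanse: drop exact duplicate records.
log = raw_log.drop_duplicates()

# 4. Analyse: simple descriptive counts per student and event type.
summary = log.pivot_table(index="student", columns="event",
                          values="timestamp", aggfunc="count").fillna(0)
print(summary)
```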

Promises and challenges of learning analytics

I joined a MOOC (Learning Analytics and Knowledge – LAK11, with George Siemens et al.) looking at learning analytics as a means to gather data about student and institutional performance and needs. This emerging research method promises to reach far and wide for clusters of data and analyze them in a systematic way, simultaneously, spontaneously, and immediately – thus supporting data-driven decisions in the moment.

In my exploration of robust research methods to see if and how students are learning with technologies, I latched onto this course and the upcoming conference.

In the initial week, preliminary readings were given (a basic description and educational data mining). From what I have learned, this method promises to gather data from various systems – such as student databases, social networks, online courses, institutional records, etc. The point is that such data is already available and can be merged and analyzed using numerous analysis methods, such as predictive and relational applications.

I had two reactions. First, what are the possibilities? Can we gather data about student attitudes and interests through social networking sites, their interaction in online learning platforms, their use of online library systems, etc., and blend the data to discover who they are, what they want, and how they learn, along with their progress? In my opinion, this would require a paradigm shift in quantitative research thinking, which insists on controlled variables and focused outcomes.

This leads to my second reaction: how will we be able to logistically mix the variables and constructs of the various data types? Will we be able to find a way to compare student preferences (from likes and dislikes on Facebook), with chosen modes and frequency of communication (email, LMS, smartphones), with grades and course evaluations?
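
As a rough sketch of what that logistical mixing might look like at the simplest level, assuming each system can at least export a table keyed by a common student identifier (the tables and values below are hypothetical):

```python
# Joining invented data from different systems on a shared student ID.
import pandas as pd

lms_activity = pd.DataFrame({
    "student_id": ["s01", "s02", "s03"],
    "forum_posts": [12, 3, 25],
    "logins": [40, 15, 60],
})

grades = pd.DataFrame({
    "student_id": ["s01", "s02", "s03"],
    "final_grade": [78, 64, 88],
})

survey = pd.DataFrame({
    "student_id": ["s01", "s03"],          # not every student answered
    "preferred_channel": ["email", "smartphone"],
})

# Merge on the shared identifier; keep students missing from the survey.
merged = (lms_activity
          .merge(grades, on="student_id", how="left")
          .merge(survey, on="student_id", how="left"))

print(merged)
```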

This sort of blows my mind, but I am game to find out more. It really has the potential to bring together the masses of data now available thanks to technology, and make sense of them.

However, I think we have a long, tough job ahead of us to make this type of research analysis valid and reliable. But I’m in!

Researching learning with technology, Part 3

Graham Attwell (ed.) (2006) offered a comprehensive paper on various types of evaluation methods for e-learning. With a focus on European educational institutions, he and his collaborators discussed principles, methods and the history of evaluation in the educational field. Methods and models ranged from evaluating policies and learning to program impact, with focuses on management, consumers, pedagogy, and so on. The paper provides a good shopping list for evaluators and researchers.

One particular model they provided was for the evaluation of learning and teaching processes in virtual environments. This model was developed and tested as part of an EU project, E-VAL3. On page 35 of the document, they offer a working table that combines two established aspects of e-learning. One segment draws on the work of Jonassen, Peck, and Wilson (1999) and their principles for learning with technology (ICT); for instance, learning with ICT should include activities that are active, constructive, reflective, intentional, authentic, conversational, and interactive. The other segment draws on the work of Paulsen (1995) and the essentials of effective computer-mediated communication, such as organizational, social and intellectual functions.

The combined evaluation table/chart is meant to be used while observing or assessing teaching and learning in a virtual environment.
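
The original chart itself is not reproduced here; purely as a rough illustration of my own (not the E-VAL3 table), the two segments described above could be crossed into an observation grid along these lines:

```python
# Illustrative cross of the two dimensions described in the paragraph above.
import pandas as pd

# Jonassen, Peck and Wilson's qualities of learning with ICT (rows)
qualities = ["active", "constructive", "reflective", "intentional",
             "authentic", "conversational", "interactive"]

# Paulsen's computer-mediated communication functions (columns)
functions = ["organizational", "social", "intellectual"]

# An empty grid an observer could fill in with ratings or field notes.
observation_grid = pd.DataFrame("", index=qualities, columns=functions)
print(observation_grid)
```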

I believe this evaluation tool has some promise and focuses on key elements to be included in e-learning experiences. However, it doesn’t seem to evaluate more elementary components of learning such as knowledge acquisition, as described by Bloom, or student perception.

As well, there was little explanation of how to use the tool or analyze the results. The paper implied that input into the tool would be based on high-inference judgments by the evaluator, but did not offer ways to overcome bias. It seems the model was tested further and refined.

However, in looking for further documentation on this evaluation tool, I found that the main project has been discontinued and little writing on the method after 2006. As well, the main website (http://www.evaluate-europe.net/) no longer functions. A deeper search might be needed.

Researching learning with technology, Part 2

I am still exploring ways to better study the impact of technology on learning, as with online and blended learning. A few weeks ago I presented some of my ideas and queries in a CIDER presentation. In that session I shared that I was not completely satisfied with my doctoral study methodology, or with much of what I have seen presented in the literature. There needs to be a stronger mix of qualitative and quantitative methods that can capture the impact of learning with technology, such as the study of:

  • human-computer interaction
  • the brain (cognitive learning)
  • the senses (visual, design)
  • system and social dynamics

EDUCAUSE has recently created an initiative to find ways to explore the ‘evidence’ that would reveal the impact of technology, as they, too, are unsatisfied with current research methods. They state:

“With many options and constrained budgets, faculty and administrators must make careful decisions about what practices to adopt and about where to invest their time, effort, and fiscal resources. As critical as these decisions are, the information available about the impact of these innovations is often scarce, uneven, or both. What evidence do we have that these changes and innovation are having the impact we hope for?”

Some of their preliminary readings focus on established learning theories and the use of technology. I think this is a good start, and feel we must not throw out the baby with the bathwater. Regardless of our intent to claim something new and innovative, we need some established framework from which to work. People are people, and they behave and learn in certain ways, albeit shaped and affected by the evolving environment and technology.

Back to my notion of getting more technical with research data. A new discussion is emerging about creating systems that collect and interpret data on the resources used and activities performed by learners/users, whether informally or formally. The point is to help mine data for users, given the overwhelming amount of information and resources available on the net. Even the most information-literate person struggles to bring together, peruse, and decipher the multitude of information. I can hardly keep up!

Creating a data collection and interpretation system would hone information and provide recommenders (i.e., what to read or explore next based on past activities). The group at Athabasca/TEKRI with George Siemens (sense making and learning analytics) and the group at NRCC with Stephen Downes (Web X) are exploring this concept.
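
As a toy sketch of the recommender idea (the learners, resources and matching rule are all invented; real systems use far richer signals):

```python
# Suggest resources used by learners whose histories overlap with yours.
from collections import Counter

# Resources each (hypothetical) learner has already accessed.
history = {
    "ana":  {"intro_video", "paper_1", "forum_thread"},
    "ben":  {"intro_video", "paper_2"},
    "cara": {"paper_1", "paper_2", "dataset"},
}

def recommend(learner, history, top_n=3):
    """Recommend unseen resources, weighted by overlap with other learners."""
    own = history[learner]
    scores = Counter()
    for other, items in history.items():
        if other == learner:
            continue
        overlap = len(own & items)          # shared resources = similarity
        for item in items - own:            # only suggest unseen resources
            scores[item] += overlap
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("ana", history))   # -> ['paper_2', 'dataset']
```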

I am excited about the possibility of using honed and interpreted data to study the processes and needs of learners who use technology. Learners are no longer in a bubble but have extended learning environments (PLEs), social networks, and vast amounts of web-based resources. How are they using these artefacts and systems? How can we help them use them to learn? How does their learning develop, or not develop, with them? Such questions are taking on new meaning, compared to earlier days, because of advanced information, communication and networked technologies.

I hope to attend the learning analytics conference to learn more about this research method. Until then, I’ll keep exploring potential methodologies to study the impact of technology on learning.

Best research methods when using tech, Part 1

I am on a quest to find valid, rigorous and sophisticated research methods to explore the impact of using technology in learning. Many research studies in education lean towards case studies and qualitative methods. I think we need something more robust with informative constructs in order to examine any shift in learning when using technology, such as with blended learning formats. I want to examine cognitive changes versus personal experiences with participants. As Dr. Martha Cleveland-Innes of Athabasca University recently suggested at this year’s CSSHE conference in Montreal, we need more research that focuses on causation, not explanation.

My blog will become my learning diary as I turn over some stones and consider different methods.

A couple of years ago I reviewed and wrote about a book on researching technology education, edited by Middleton (2008). One particular research method that struck me was by Lars Bjorklund. He used a repertory grid technique to “elicit underlying, often tacit criteria that professional teachers use when they assess creative work” (p. 46) and to chart and analyze them. Using polarized criteria or design elements, such as ugly and beautiful, he uncovered through interviews the essential criteria for design work as seen by experts. I would think something of this sort could be used to examine the thinking of students and then compare it to their thinking and constructive building of knowledge after using technology.
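
As a small illustration of how such a grid might be analysed (the pieces of work, constructs and ratings below are invented):

```python
# A toy repertory grid: student design work (rows) rated on bipolar constructs
# such as ugly(1)-beautiful(5). Ratings are invented for illustration.
import pandas as pd

grid = pd.DataFrame(
    {
        "ugly-beautiful":      [2, 4, 5, 1, 3],
        "cluttered-simple":    [1, 4, 5, 2, 3],
        "derivative-original": [3, 2, 5, 1, 4],
    },
    index=["work_A", "work_B", "work_C", "work_D", "work_E"],
)

# Constructs that correlate highly may reflect the same underlying (tacit) criterion.
print(grid.corr().round(2))
```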

In a recent ALT-J issue, Nie et al. shared their experience with a research study on the role of podcasting in a graduate program. They were interested in how uploaded podcasts would benefit campus-based students. Their method was similar to design-based research, though they called it action research. After collecting data through surveys, student blog entries and discussion board postings, curriculum meetings, pre- and post-interviews, and eventual analysis, changes were made to the curriculum and the use of podcasts. The research was spread over three terms of the course that used podcasts, giving it the rigor of a longitudinal study.

Terry Anderson of Athabasca University also supports design-based research, a method advocated by Thomas Reeves of the University of Georgia. Design-based research is quite pragmatic, focusing on real-world problems and directly impacting the researched phenomenon. It blends learning and design principles with innovative learning environments to improve learning. In short, it relies on theory and a variety of research methods to improve the ways students learn with, and are supported by, technology.