Emerging research method: Learning Analytics

Recently I attended the First International Conference on Learning Analytics and Knowledge 2011 (LAK11) in beautiful Banff, Alberta, hosted by TEKRI, a division of Athabasca University. I was inspired by the people I met and learned from: peers such as George Siemens, Dave Cormier, Terry Anderson, Rory McGreal, Grainne Conole, and David Wiley, as well as a number of statisticians, data analysts, programmers, educators, and innovators.

Below are my collective notes on the subjects discussed at the conference about the emerging research and analysis method, Learning Analytics. Please note that this is an emerging concept, and the notes reflect an array of perceptions and ideas from a wide range of people; they also reflect the newness and potential of this process. (Please correct me if I have misunderstood any notions.)

In subsequent posts I will share my views on the method and its potential, challenges, and discrepancies, in the hope of contributing to a scholarly discussion.

Notes:

  • Overview
    • Learning analytics is the analysis of triangulated (mashed-up) data about learners and their context, learning, and environment. It is strategic analytics for administration, academics, and context.
    • It is an attempt to map student comprehension within a specific domain, drawing on pragmatic analysis from the marketing field and the analysis of consumer behaviour. The steps of learning analytics are collecting, reformatting or coding, cleansing, and analysing (including visually) the data; in effect it is a form of data mining that extracts representative information about learners and analyses it in automated ways. (A minimal sketch of these steps follows these bullets.)
    • The semantic web is a way to connect personal information about someone and determine patterns of behaving, knowing, and learning. However, this requires bringing meaning to written words and creating ontologies to connect them; it is a daunting task, given the challenge of misinterpreted and subjective meaning. Ontologies could include those for content structure and type, user models, learning design, and domains.
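
To make the collecting, reformatting, cleansing, and analysing steps above concrete, here is a minimal sketch in Python using pandas. The file name, column names, and cleaning rules are hypothetical placeholders of my own, not anything prescribed at the conference.

```python
import pandas as pd

# 1. Collect: load raw activity data exported from an LMS.
#    ("lms_activity.csv" and its columns are hypothetical.)
raw = pd.read_csv("lms_activity.csv")

# 2. Reformat / code: normalise column names and parse timestamps.
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
raw["timestamp"] = pd.to_datetime(raw["timestamp"], errors="coerce")

# 3. Cleanse: drop rows missing a student id or with an unparseable timestamp.
clean = raw.dropna(subset=["student_id", "timestamp"])

# 4. Analyse: a simple descriptive summary, events per student per week.
clean = clean.assign(week=clean["timestamp"].dt.isocalendar().week)
summary = clean.groupby(["student_id", "week"]).size().rename("events")
print(summary)
```
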
    • Learning Analytics Goals
      • Mapping learner interactions and actions using multiple ontologies and theories
      • Testing learning theories and approaches
      • Choosing analysis types
        • Best to do predictive and optimisation analysis rather than merely descriptive analysis and forecasting (Davenport & Harris)
      • Monitoring learning stages
        • Tracking student records and successes from high school to postsecondary
      • Determining learner support needed
        • Determining interventions to help students, e.g. recommenders and intelligent tutors
      • Revealing student patterns
        • Student course and resource choices (web tracking)
        • Association between media and roles
        • Student attention
        • PLE usage
        • Impact of group and collaborative work; communal view
      • Providing gap analysis:
        • Student progress compared to learning outcomes ontologies
      • Reviewing learning outcomes
        • Analysis of 21st century skills
        • Richness of dialogue (Mercer)
        • Student-rated discussion posts
        • Correlating online activity with grades
        • Correlating resource use with grades (a toy correlation sketch follows this list)
      • Socio-technical affordances: technology, self and others
      • Effects of networks on workload, role, and learning
      • New directions:
        • New light or perspective on a phenomenon
        • New or altered predictive or causal models
        • New views on learning
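
As a toy illustration of the goal of correlating online activity (or resource use) with grades, the sketch below computes a simple Pearson correlation with pandas; the numbers are fabricated purely for illustration and imply nothing about real students.

```python
import pandas as pd

# Hypothetical per-student data: weekly logins and final grade (%).
df = pd.DataFrame({
    "student_id":    ["s01", "s02", "s03", "s04", "s05", "s06"],
    "weekly_logins": [2, 5, 1, 7, 4, 6],
    "final_grade":   [58, 74, 52, 88, 70, 81],
})

# Pearson correlation between activity and grades.
r = df["weekly_logins"].corr(df["final_grade"])
print(f"Correlation between logins and grades: r = {r:.2f}")

# The same idea extends to resource use: add a column of resource
# downloads and call df.corr() to get the full correlation matrix.
```
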
    • Learning predictors (a toy predictive-model sketch follows this list):
      • 21st century skills
      • granularity of learning
      • comprehension
      • student behaviour
      • expert vs novice behaviour
      • learning outcomes
      • interaction with others
      • participation patterns (SNA)
      • social isolation
      • use of learning tools/widgets
      • time on task
      • resource use
      • assessments of others on student work; community opinion
      • quality of discourse
      • effects of keynote speaker on discourse
      • effects of media on interaction
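
To show the kind of predictive analysis these predictors feed into, here is a minimal scikit-learn sketch that fits a logistic regression to a handful of invented features (time on task, forum posts, resources used) and estimates a pass probability. Nothing here reflects a real model presented at the conference; it is only a sketch under those assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per student: [time_on_task_hours, forum_posts, resources_used]
X = np.array([
    [12, 3, 10],
    [2,  0,  1],
    [8,  5,  7],
    [1,  1,  0],
    [15, 8, 12],
    [4,  2,  3],
])
# Hypothetical outcome: 1 = passed the course, 0 = did not.
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Estimate the pass probability for a new (hypothetical) student,
# the sort of output an early-warning intervention could use.
new_student = np.array([[5, 1, 4]])
print("Estimated pass probability:", model.predict_proba(new_student)[0, 1])
```
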
    • Learning environments analyzed:
      • Social networking sites
      • PLEs
      • LMS
      • Web2.0 tools
      • Laptops
    • Data sets:
      • hashtags used and connections (topic of interest and connections)
      • Delicious tags used and connections (need semantic meanings)
      • Twitter count of a URL
      • student records/successes from high school to postsecondary
      • connection of student learning to other data (vendors, etc.)
      • discourse analysis using semantics, connections, and types of discourse (Mercer)
      • patterns of actions and behaviours to implement interventions
      • descriptive data: log-in count and times, resources chosen (see the pandas sketch after this list)
      • triangulation of website, course, library and online activity
      • social connections of learners
      • nested model: cluster behavioural data with social practices; combine quantitative data
      • merge interests, social connections, and resource use (how people learn)
      • dimensions involved in learning object use
      • contextualized action model: acts dependent upon another; semantic connection
      • merging concept analysis with social analysis
      • merging content analysis with topic modelling
      • institutional data for cross sectional and cross tabulation analysis
      • group analytics
      • automated discourse analysis
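
For the descriptive data set mentioned above (log-in counts and times, resources chosen), here is a short pandas sketch of the aggregation; the event-log layout is an assumption of mine, not a format discussed at the conference.

```python
import pandas as pd

# Hypothetical LMS event log: one row per student action.
events = pd.DataFrame({
    "student_id": ["s01", "s01", "s02", "s02", "s02", "s03"],
    "event":      ["login", "view_resource", "login", "login", "view_resource", "login"],
    "resource":   [None, "week1_slides", None, None, "week1_quiz", None],
    "timestamp":  pd.to_datetime([
        "2011-03-01 09:00", "2011-03-01 09:05",
        "2011-03-01 20:00", "2011-03-02 21:30",
        "2011-03-02 21:40", "2011-03-03 08:15",
    ]),
})

# Log-in counts and typical log-in hour per student.
logins = events[events["event"] == "login"]
print(logins.groupby("student_id").agg(
    login_count=("event", "size"),
    median_login_hour=("timestamp", lambda t: t.dt.hour.median()),
))

# Resources chosen per student.
print(events.dropna(subset=["resource"]).groupby("student_id")["resource"].unique())
```
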

      • Data analysis tools and methods
        • ROLE (Responsive Open Learning Environments): tracking of widget use
        • Recommenders: based on resources used, learning actions, etc.
        • Contextualized Attention Metadata (CAM): track online movements
        • Google Analytics: website traffic
        • Google Social: public connections of web users
        • Screen scraping: reformatting data published on the web in HTML format
        • Any data sets in usable formats: CSV or Excel data
        • Yahoo Pipes: filter web content into a site and analyse it; data needs reformatting
        • Many Eyes: IBM visualisation tools; existing data sets and visualisations available
        • Gephi mapping tool?
        • Google Refine: cleanses data
        • Freebase: data sets to mash-up; restructure to another model
        • Stanford Data Wrangler: restructures any unstructured data
        • bit.ly: analysis of connected web links
        • Python: scripting against the Google Analytics API to parse reports
        • Various dashboards of visualisations
        • Selection of shared metrics and algorithms
        • Gephi: links among networks
        • Google Trends: traffic insight on favourite websites
        • SNAPP: social network visualisation
        • Lecture capture: analysis of viewer actions
        • Latent semantic analysis (LSA)/discourse analysis and interaction analysis
        • Social network analysis (SNA): visualisation of social relationships in a network; analysis of nodes and ties (a short networkx sketch follows this list)
        • Game theory: capture and predict behaviours in strategic situations; dependency on others
        • Virtual machines tracking student actions; full computer systems
        • Wakoopa: track website visits
        • RescueTime: tracks time spent on tasks
        • LMS analysis: student activity and grades tracked in learning management systems
        • AAT (Athabasca University): new academic analytics tool for Moodle and other LMSs; tracks student engagement, tool use, and outcomes
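
To give a feel for the SNA and SNAPP-style visualisation entries above, here is a minimal sketch using the networkx library; the who-replied-to-whom edges are invented for illustration.

```python
import networkx as nx

# Hypothetical forum interactions: an edge means "replied to".
G = nx.DiGraph()
G.add_edges_from([
    ("alice", "bob"), ("bob", "alice"), ("carol", "alice"),
    ("dave", "alice"), ("dave", "bob"), ("erin", "dave"),
])

# Degree centrality highlights the most connected participants;
# isolated or peripheral students show up with low scores.
centrality = nx.degree_centrality(G)
for student, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{student:6s} centrality = {score:.2f}")

# networkx can also draw the graph (requires matplotlib):
# nx.draw_networkx(G)
```
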
      • Questions about Learning Analytics
        • Who accesses and can access data?
        • Who needs the analysis?
        • Who knows about learning analytics?
        • What are the theories or models underlying analysis query?
        • How will others react to the analysis?
        • What new lens/perspective can be applied?
        • What minimal data is needed?
        • How will the results be used?
        • What interventions would be best?
        • What is learning, and how do we measure it?
      • Challenges
        • Ability of data to show learning
        • Defining learning and learning ties
        • Defining drop-out for online learners (these three are of particular interest to me)
        • Gathering and analysing data from multiple platforms
        • Developing infrastructures to collect, analyse, interpret and report data
          • Placement and access of infrastructure
          • Data management and technology
          • Student- and faculty-centred tools; usable interfaces
        • Valid and reliable data
          • Fragmented data on online learners
          • Quality of data
          • Working with unstructured data
          • Better data sets
          • Operational definitions of constructs
            • Expert opinion
            • Theory based
            • Data driven
          • Defining context in word meaning
          • Analysing causal rather than predictive data
          • Invisible network of participation
        • Security and privacy issues; being transparent
        • Dynamic vs. static analysis models; predictors constantly changing
        • Cultural considerations/differences
        • Automating discourse analysis (a rough LSA sketch follows this list)
        • Merging big data (quantitative) and qualitative data: triangulation
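
As a rough sketch of what automating discourse analysis with latent semantic analysis (LSA) could look like, the snippet below combines scikit-learn's TF-IDF with truncated SVD. The forum posts are invented, and this is only one of many possible approaches, not a method endorsed at the conference.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical discussion posts from a course forum.
posts = [
    "I think the reading on motivation explains the drop-out pattern.",
    "Motivation and engagement seem central to why students drop out.",
    "Can someone share the link to next week's assignment?",
    "The assignment link for next week is on the course page.",
]

# LSA: a term-frequency matrix reduced to a low-dimensional semantic space.
tfidf = TfidfVectorizer(stop_words="english").fit_transform(posts)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Cosine similarity in that space groups posts about the same topic,
# even when they share few exact words.
print(cosine_similarity(lsa).round(2))
```
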
      • Data Science Team Members
        • Stakeholders/users
        • Data scientists
        • Programmers
        • Statisticians
        • Visualiser
        • Learning scientists/instructional designers
        • Data evaluators
        • Information technologists
        • Interpreters
        • Project manager