Tuesday, January 25, 2011

Posted on other blog on this

Readings worth a look to kick-start an understanding of the concepts and theory. I've written a bit on this in my PhD journey blog at: http://shazz-pkm-okm-framework.blogspot.com/2011/01/readings-worth-to-kick-start.html

Thanks, George, for sharing such simple-to-understand and worthwhile reading! Appreciate it.

Now I'm more interested in the topic,
- Shazz @ LAK
25 Jan 2011

Monday, January 24, 2011

Week 3 Video to Review: Metaweb

Sharing the video that is clear and easy to understand... Check this out!

Slowly turning into Gen Y?
- Shazz @ LAK
24 Jan 2011

Saturday, January 22, 2011

Similar to SNAPP, probably better!

Other applications offer more functions and features that extend the capabilities of SNAPP. George gave a long list of applications similar to this 'SNA' tool, which I need to try out when I'm free later on...
Re: SNAPP look-alikes
by George Siemens - Tuesday, 18 January 2011, 02:37 PM
Hi Bert - in your request for SNAPP-look alikes, I assume you are referring to tools that map social networks. If so, there are numerous options available (some free, some not):

Netdraw: http://www.analytictech.com/netdraw/netdraw.htm
Pajek: http://vlado.fmf.uni-lj.si/pub/networks/pajek/
NodeXL: http://nodexl.codeplex.com/
Netminer: http://www.netminer.com/NetMiner/home_01.jsp
Touchgraph: http://www.touchgraph.com/navigator
Gephi: http://gephi.org/
UCINet: http://www.analytictech.com/ucinet/


The reason we went with SNAPP for the week 2 activity is that it's a simple browser plugin and doesn't require manually loading data. It demonstrates social network analysis without needing the technical skills of the tools listed above.

Bert also shared an application called SaNDVIS - coincidentally, it has the features I suggested in my previous blog, such as tags/keywords that make up the threads/connections, and such... Marvellous!! B-)

Thanks, guys! It's really worth joining this MOOC! ;D
- Shazz @ LAK
22 Jan 2011

Friday, January 21, 2011

Discussion on SNAPP

Honestly, I was looking for this tool to generate a 'nodal network' to analyse the knowledge sharing over my social network. However, this 'webpage-embedded analytics tool' - SNAPP - is designed and programmed specifically for educational LMSs, such as Moodle and Blackboard, to name a few.

What is the benefit of SNAPP?
In terms of learning analytics, it is no doubt a good tool for understanding 'who learns from whom', 'who picks up the trail from whom', 'who follows or has an interest in whom/whose topic', and so on. It's the people and their activities that make the network. It is something I can make use of in my class for identifying 'leaders' who have their own group of classmates who understand their style of 'sharing knowledge'.

I find this (a tool to identify leaders/facilitators) useful for programming (or highly technical) subjects, where students who help each other understand the technicalities of a programming language could assist me in guiding the rest once they understand the concept I teach - so that I can concentrate on those who need my further attention to catch up with the topic taught in class. I experience this a lot in physical computer lab classes, especially when I have to cover certain topics in the syllabus and at the same time monitor that everyone is catching up; at each lab session there could be different 'leaders' who voluntarily assist me in facilitating others, either by explaining the topic (how to program) or by debugging the code for their friends.

So these are the benefits I find in SNAPP - identifying prospective leaders who could become MOOC initiators in their own environments/universities/communities.
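To make the 'leader spotting' idea concrete, here is a minimal sketch (in Python, with invented names and reply data - nothing from the actual LAK11 forum) of how one might rank participants by how often they are replied to, which is roughly the in-degree you would see in a SNAPP-style diagram:

```python
from collections import Counter

# Hypothetical forum data: each reply records (author, replied_to).
# All names here are made up for illustration.
replies = [
    ("ali", "shazz"), ("mei", "shazz"), ("tom", "shazz"),
    ("ali", "mei"), ("shazz", "george"), ("mei", "george"),
]

# Count how often each person is replied to: a rough 'leader' signal,
# analogous to a node's in-degree in a social network diagram.
in_degree = Counter(target for _, target in replies)

# Treat anyone with 2 or more replies received as a candidate leader.
leaders = [person for person, n in in_degree.most_common() if n >= 2]
print(leaders)  # ['shazz', 'george']
```

Of course a real tool would weigh more than raw reply counts, but even this crude signal surfaces the students others keep turning to.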

What additional functionality is required?
Functionalities that may be valuable to add on can be:
  • Keywords of discussion that people pick up from each thread that makes the connection - this could show how a topic disperses or branches out into new topics, and how/what people actually understand from the original aim of the syllabus.
  • Time gap between one person and another - this could show how long a person takes to catch up and what he/she may have missed that causes them to rely on those at the end of the thread.
  • In a real physical class scenario, students/learners may tend to catch up only at the end of the 'semester', so they may not follow much on a weekly basis; a function that could tell who may be a bit left behind - a different colour or something?
  • If it is logical that having more connections means you're a good learner, then maybe a function that shows a ranking in terms of number of connections (with other attributes/parameters) could assist in identifying prospective "good" students.
  • Understanding the patterns of connections and what learners have learnt may help in structuring the assessment better.
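The 'time gap' and 'left behind' suggestions in the list above could be prototyped very simply. This sketch (with invented students and dates) flags anyone who has been silent for more than a week, much as a colour code in the diagram might:

```python
from datetime import datetime, timedelta

# Hypothetical last-post timestamps per student, invented for
# illustration of the 'time gap' feature suggested above.
last_post = {
    "ali": datetime(2011, 1, 20),
    "mei": datetime(2011, 1, 10),
    "tom": datetime(2011, 1, 2),
}
now = datetime(2011, 1, 21)

# Flag anyone silent for more than a week, as a colour code might.
flags = {
    name: "red" if now - t > timedelta(days=7) else "green"
    for name, t in last_post.items()
}
print(flags)  # {'ali': 'green', 'mei': 'red', 'tom': 'red'}
```

The threshold of seven days is arbitrary; a real tool would let the facilitator tune it to the course's rhythm.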
Again, I still hope that I'm on the right track with our LAK topics.

4 more weeks to catch up!
- Shazz @ LAK
21 Jan 2011

Falling for SNAPP

It was only when I read Dave's blog (http://davecormier.com/edblog/2011/01/20/mooc-newbie-voice-week-2-big-data-must-be-important-its-big/) that I realised, "What? We were supposed to use SNAPP this week?" Now, how did I miss that? ;D

Let me share with you what I've discovered from using SNAPP, before I post another entry to answer George's questions...

My first try was on one of the discussion topics started by George in Week 2, which I had joined. As I understand it, I need to go into a topic I've contributed to. So the first result is as follows:

My second try with SNAPP was on the Week 2 main list of discussion topics. I hovered the mouse over my 'node', showing details of my 'data':

I got so excited that I wanted to try the Week 1 list of topics, since it had more responses from members of our MOOC. By this time, I could hardly see my node, even though I had made more than 2 posts that week:

I saw a link from SNAPP to NetDraw, and a statement saying that we can export the data in VNA format into the NetDraw application. So I downloaded and installed NetDraw, copied the text from SNAPP into a notepad file, saved it as a *.vna file, and opened the *.vna file in NetDraw to get this result:

Believe it or not, I shouted "Yippee!!" when I saw that the *.vna file really does give a graphical view like the one in SNAPP, just by opening the 'text' file! ;D Interesting!
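For the curious, the exported file really is just text. The sketch below imitates the general shape of such a file (the exact section headers NetDraw expects may differ - treat this as illustrative only, with invented node names) and parses the tie section back into an edge list:

```python
# A minimal sketch of the kind of VNA-like text file SNAPP exports.
# Section headers and data are invented for illustration; the point
# is only that the 'graph' is plain text you can parse yourself.
vna_text = """*Node data
ID
alice
bob
carol
*Tie data
from to strength
alice bob 1
bob carol 2
"""

def parse_ties(text):
    """Return (from, to, strength) tuples from the *Tie data section."""
    edges, in_ties = [], False
    for line in text.splitlines():
        if line.lower().startswith("*tie data"):
            in_ties = True          # entering the tie section
            continue
        if line.startswith("*"):
            in_ties = False         # some other section begins
            continue
        if in_ties and line and not line.startswith("from"):
            src, dst, w = line.split()
            edges.append((src, dst, int(w)))
    return edges

edges = parse_ties(vna_text)
print(edges)  # [('alice', 'bob', 1), ('bob', 'carol', 2)]
```

Once you have the edge list, drawing the same picture NetDraw shows is just a matter of feeding it to any graph-plotting tool.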

Amazing at technology, amazing at discovery made,
- Shazz @ LAK
21 Jan 2011

LAK11 Orientation Presentation

Just sharing here the course outline and pedagogy, for LAK11.

This may be useful if we want to practise a similar method in our own classes. I'm already considering this myself.

Trying something new is always a good 'learning' experience,
- Shazz @ LAK
21 Jan 2011

A short break...

To my readers (if there are any)...
Let me explain the purpose of this blog, so that everyone understands that it is not just about reviewing what I've learnt from LAK11.

If you noticed, so far I have copied all my reviews and put them in here, for a few reasons:
1. To remind me of my own thoughts, so that it's easier for me to catch up when I need to at a later stage.
2. Surprisingly, it could be useful when writing analysis in my own research papers - it has happened before, especially with reviews of articles read. I don't have to re-read and rethink how to review anymore, but merely 'pick and lay'.
3. To show how I can 'connect' the topics to my own experiences, in the hope that others (i.e. readers) can learn something from it.
4. To bookmark important sources and points of my understanding, of course, and for backtracking purposes.
5. For my own 'learning analysis', if not analytics. ;D

Now that I have a blog to share my thoughts on this topic, hopefully it will be a long-lasting one, with many more LAK shares to come after the course ends.

All the best to us in LAK11!! ;D
- Shazz @ LAK
21 Jan 2011

Another comment posted on somebody's blog

Among the recommended reviews is this one by Wolfgang Greller, at

Basically, he reviewed the experience of the MOOC. Again, I feel that I'm in the same boat. So here's my comment:

Shazz Says: Your comment is awaiting moderation.

I totally agree with you, especially the last part. I myself thought that I could use my old method of strategising my way through accessing the information from the MOOC, so I (by habit) created a blog for this, even followed George in creating a NetVibes page, etc.

But I still find that it's too wide and diverse, and that I still have to depend on my skimming skills (if I have the time), my time (to go through the email notifications) and the facilitator's review (and from there I would rather read the reviews posted by other learners than the readings recommended by the facilitator!).

As you said, I worry least about all this, because as long as we know where things are and how to 'search' for them later on after the course, I think we're quite safe to catch up at a later stage. The important thing is to contribute via reviews, because that ensures our understanding, at least for our own relief.

- Shazz, Kuala Lumpur

Finding own paths in MOOC,
- Shazz @ LAK
21 Jan 2011

Thursday, January 20, 2011

A video worth a view...

I remember citing Peter Norvig's writing in my research paper, but only now do I learn he's from Google! Wow wee! ;D

Thanks for sharing this, LAK!
- Shazz @ LAK
20 Jan 2011

Comment on Sheila's Blog

One of the recommended feedbacks/reviews by our facilitator is a blog post by Sheila:

I haven't really read through her post yet, but I understand from Mark's comment that one of the points mentioned by Sheila was about adapting herself to the MOOC - which is what everyone is experiencing, even though we're already at the end of Week 2! Well, welcome to the club! ;D

So here's my comment (which, at this moment, is still in the queue for approval by Sheila):

I agree with Mark. It's quite tough in the beginning, because of the 'panic' of not really knowing what we need to know in order to follow the topics... But thanks to previous experience following George's teaching in 2008, I'm more confident about 'leaving behind' some of the texts/articles recommended for the study weeks.

In fact, I find it more comfortable reading others' reviews of the texts first, then going into the texts to 'understand the gist' myself and compare it with the reviews. At least some of the main ideas are already covered by the reviews, so we don't really miss much if we skim through and spot other points that interest us.

One thing is for sure: what our eyes spot as interesting might be a different angle of understanding compared to others'. Interesting! ;D (even though it sounds like we tend to get lost too. LOL)

- Shazz, Kuala Lumpur

Sincerely, a learner,
- Shazz @ LAK
20 Jan 2011

A quick response to one of the topics in Week 2

Answering the discussion topic by Xavier Ochoa:
LAK vs EDM vs Educational Research
by Xavier Ochoa - Wednesday, 19 January 2011, 04:10 AM
Today at the very interesting talk by Ryan Baker, a question arose about the differences (and similarities) between Educational Data Mining, Learning and Knowledge Analytics and the traditional field of Educational Research. I think that this question deserves further exploration in this course.

The definitions that those terms have in Wikipedia are very similar:

Educational Research:
Educational research refers to a variety of methods in which individuals evaluate different aspects of education including but not limited to: “student learning, teaching methods, teacher training, and classroom dynamics”.

Educational Data Mining:
Educational Data Mining (called EDM) is an emerging discipline, concerned with developing methods for exploring the unique types of data that come from educational settings, and using those methods to better understand students, and the settings which they learn in.

Learning (and Knowledge) Analytics:
Learning analytics is the use of intelligent data, learner-produced data, and analysis models to discover information and social connections for predicting and advising people's learning.

Is there a difference? Do you think that EDM and LAK are just an evolution of Educational Research, but with better tools and data, or there is something radically different and unique in the new approaches?

Do you think that EDM and LAK are synonymous or there is a meaningful difference between the two fields? Should we merge or we should keep them separated?

Let's discuss.

My review was as follows:

I do believe that both LAK and EDM come from the areas of Educational Research, even though not directly; they are more about technicalities in terms of data/information and methods of analysis. They are the areas shared between Educational Research and other technological research, such as artificial intelligence and, of course, data mining.

So far, this is what I understand of the 'research areas' that make up LAK (as shown in the attached diagram).

- Shazz
Kuala Lumpur 10:30PM

Just started to 'follow' Week 2 activities,
- Shazz @ LAK
20 Jan 2011

Dave's Review on Week 2 activities

I can't quite find a way to 'follow' Dave's blog from here, so I've just bookmarked the link so that I can refer to it when I'm free.

Thanks for the recommendation, George! I like Dave's way of review too. Clear!

Currently busy with PhD proposal defense preparation,
- Shazz @ LAK
20 Jan 2011

Wednesday, January 19, 2011

My review entailing Viplav Baxi's comments

I like the way Viplav explained the difference between metrics/numeric measurement and paths and patterns in analytics. I had been trying to explain it in words for quite some time but just couldn't, even with a clear vision in my head of how it looks...
Re: What about learning analytics in the corporate sector?
by Viplav Baxi - Sunday, 16 January 2011, 05:49 PM

One of the areas that greatly interest me is simulations for corporate training needs. I have done work in the IT and Financial sectors that show the power of simulations to bring together some complex information about how a learner navigated a simulated job situation. Scores are too simplistic to do justice to such complex tracking of learner progress and competence. Consequently, learning & knowledge analytics become more complex as well.

For example, let us consider a scenario that has multiple decision points (connected like in a graph) and multiple paths to the correct outcome. Let us assume that there is an ideal path (not hard to imagine in a highly disciplined process training). A learner's decision making trail or actions trail could be compared to the ideal path/trail and analytics could be programmed to infer from deviations to get a better and more comprehensive picture of learner performance. (also not unlike the notion of knowledge analytics being used to compare competency levels in a discipline).

If you know cricket, you would be familiar with a graph that shows runs scored vs overs for both teams with circles denoting fall of wickets, resulting in what are popularly called "worms" that deviate from each other on the graph as the match progresses.

Point is, these analytics move from being comparisons between numbers (Peter spent 5 more minutes than Pan on the google group), to being comparisons and analytics based on patterns and paths.


My comments:

Thanks, Viplav... Your explanation really put my understanding in words, quite clearly (about the patterns and paths versus numbers).

Tailing this, I wonder:
- How can this pattern be presented to senior management (assuming they are not at 'our level of understanding') in a form that they appreciate?
- How can we be certain that the paths an employee takes in getting to the final point are something they learnt from and that contributed to the final result, rather than merely a waste of time until they found the right 'nodes'?
- How should the evaluation be designed to make it fair for every employee (because some people do take time to understand, after long-winded paths, etc.)?
- Does this mean an experienced 'wanderer' could achieve the KPI better than newbies? I don't think so either.
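Viplav's idea of comparing a learner's decision trail against an ideal path can be sketched with a classic edit distance: the fewer edits needed to turn the learner's trail into the ideal one, the closer the performance. The step names below are invented for illustration:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance between two action sequences,
    using a single rolling row of the dynamic-programming table."""
    dp = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, y in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # delete x
                                     dp[j - 1] + 1,    # insert y
                                     prev + (x != y))  # substitute
    return dp[-1]

# Hypothetical decision trails (step names invented for illustration).
ideal   = ["greet", "verify", "diagnose", "resolve", "close"]
learner = ["greet", "diagnose", "verify", "diagnose", "resolve", "close"]

deviation = edit_distance(learner, ideal)
print(deviation)  # 1 - one stray detour from the ideal path
```

A deviation score like this could then feed the kind of pattern-based comparison Viplav describes, rather than a bare number of minutes spent.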

There's still a long way to go after this, and I believe the corporate sector is very particular about 'measurement' in evaluation.

Hope I'm on track with my points,
- Shazz
Kuala Lumpur 1:33AM

Still wondering if I'm on the same track,
- Shazz @ LAK
19 Jan 2011

Extracting the gist - Learning Analytics in Corporate Sector

Let me digest some points from the discussion that is going on in LAK11 MOOC this week. My review is yet to come, maybe tomorrow.

I like the points clearly outlined by Adam Weisblatt on this topic:
- Corporate training rarely has grades, but performance reviews and business outcomes. [KPI, as Tanya Elias mentioned.]
- Analytics is most important for decision making at the senior level.
- Prediction would be most helpful in creating development plans.
- Corporations are interested in the value of social media but they want to be able to track the advantage of its use because they pay for it in one way or another.

Skimming through the rest of the discussion thread, it makes me wonder how easily academics wander (slightly) off topic and back into the realm of education... they started talking about grades, accreditation, and such, which they think the corporate world should have certain standards for. To me, it's not at all about grades - because grades also mean that the person (or graduate) may or may not be practically useful at work. Well, there's a long list of that where it's coming from!

I love this part, since "lifelong learning" is also the 'aim' and mission of my organisation (private university in Kuala Lumpur):
"In the UK, the term ‘Lifelong Learning’ is applied to any learning that is undertaken after leaving formal education. In theory online corporate training would fall into this category but, to me, it does not fit well due to the simplistic methods used (in order to assure assessment) and because it does not address the acquisition of the ability and skills needed to apply the knowledge. Lifelong learning is more about knowledge based upon experience (both one’s own and that of colleagues) and therefore bound up with skills and ability. The trick is, understanding how to capture this lifelong learning in a way that is meaningful to employers (current and future). I am hoping that learning analytics may supply some of the answers." (Peter Condon, 15 Jan 2011)

Let me rest my case for a while,
- Shazz @ LAK
19 Jan 2011

Monday, January 17, 2011

Week 2 Readings

As expected, Week 2 has a longer list of readings. I almost couldn't see the end of the list! (Hahaha!!)

Bookmarking the syllabus for this week here, for my later visit and pondering:

Wonder if I have time to read..., at least half of these?
- Shazz @ LAK
17 Jan 2011

Review on Critics of Learning Analytics

As usual, I woke up to a long list of inbox messages from the LAK11 course. Out of the long list of replies from every member, I picked just one to read, plus George's new topic to venture into. After all that, I still found George's questions more inviting, and thus I answered...
Critiques of learning analytics?
by George Siemens - Sunday, 16 January 2011, 05:50 PM
What are your concerns with analytics when applied to learning and knowledge? What types of critiques and concepts should we explore/consider?

I've started with a few quick thoughts on the topic here: http://www.learninganalytics.net/?p=101

My honest response is as follows:

I think I've mentioned before, in one of my replies or reviews, that learning analytics is becoming more and more of an 'education management' area. Yes, we can still use it to measure performance and understand the situation in classroom settings, but what is the point if only a few classes (or facilitators) use it, and the results for the overall system cannot be tabulated?

I guess I'm talking on behalf of 'small lecturers' in a developing university that has yet to understand and realise the power of learning analytics.

My concerns are more about measurement, whether the results meet the initial objectives we set (because we tend to 'drown' in the overflow of data), and what comes next...

I mean, we can analyse all we want from the stats and figures we retrieve, but do we really know what to do with them, or what difference we can make with them? Because again, this is at the education management level, and the results may say that "this is for the registrar dept to change, not me" or "this is my faculty board's responsibility, not me".

Honest opinion,
Kuala Lumpur 6:13AM 17Jan2011

Hope to be on the right track with this,
- Shazz @ LAK
17 Jan 2011

Saturday, January 15, 2011

Review on presentation by G.Siemens

View more presentations from gsiemens.

Check out the terms like "social data", "central nervous system", "intelligent curriculum", "locational data", and how all these can be put together to produce for us the 'learning analytics'...

The following conceptual model makes sense: learners can develop knowledge from both formal and informal learning (not simply being spoon-fed), a body of knowledge can be built from intelligent data/systems through aggregation, etc., and this would qualify the learners for a degree/qualification:

The question is..., are the learners ready to go for this?
- Shazz @ LAK
15 Jan 2011

Review on Elias (2011)

From Tanya Elias' (2011) article, made available at http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf, I have summarised some ideas here...

Apart from the 'relatives' academic analytics has in other areas (i.e. business analytics, learning analytics, web analytics, to name a few), academic analytics consists of 5 steps, which align with the knowledge continuum of data-information-knowledge-wisdom:
  1. Capture - capture of meaningless data
  2. Report - the data being reported as information
  3. Predict - enabling of predictions based on knowledge and wise action
  4. Act - the action results from the predictions
  5. Refine - self-improvement project where monitoring the impact of the project is a continual effort; statistical models should be updated on a regular basis
One quote that caught my eye is this, which differentiates academic analytics from ordinary data mining:
"If we do not re-present actions to the crowd through an interface that affects similar
actions, it is just data mining for some other purpose. This is not a knowledge
discovery cycle."
From the analysis done on models and frameworks available, 7 processes related to learning analytics have emerged:
  1. Select
  2. Capture
  3. Aggregate & Report
  4. Predict
  5. Use
  6. Refine
  7. Share
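As a toy illustration (entirely invented data and thresholds, not from Elias's article), the first few of these processes could be chained like a pipeline, with the remaining ones - use, refine, share - acting on the resulting report:

```python
# A toy sketch of Elias's processes as a chained pipeline.
# Course names, students, and the posts threshold are invented.
def select(raw):       # keep only records for the course of interest
    return [r for r in raw if r["course"] == "LAK11"]

def capture(rows):     # capture just the fields we need
    return [(r["student"], r["posts"]) for r in rows]

def aggregate(pairs):  # aggregate into per-student stats
    return dict(pairs)

def predict(stats):    # crude rule: few posts => "at risk"
    return {s: ("at risk" if n < 3 else "on track") for s, n in stats.items()}

raw = [
    {"course": "LAK11", "student": "ali", "posts": 5},
    {"course": "LAK11", "student": "mei", "posts": 1},
    {"course": "CCK08", "student": "zee", "posts": 9},
]
report = predict(aggregate(capture(select(raw))))
print(report)  # {'ali': 'on track', 'mei': 'at risk'}
```

The 'use', 'refine' and 'share' steps would then act on this report - intervening with the at-risk student, tuning the threshold, and sharing the findings - closing the loop the article describes.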
These processes somehow remind me of the Personal Knowledge Management processes that I'm working on and reading about. Words such as 'aggregate' and 'share' seem to be important in ensuring that a learning process is going on in a human's mind, and this is no different from the concept of analytics!

Check out the following table that shows the relation among knowledge continuum (DIKW), 5 steps of analytics and the rest:

Elias also shares some information on SNA (Social Network Analysis). This is an example layout of discussion forum posts and replies in a learning management system, and the same discussion as a network diagram using SNAPP (http://research.uow.edu.au/learningnetworks/seeing/snapp/index.html):

What the 'network' can tell us would be very diverse, and it is all up to us and our objectives to use it wisely in terms of analytics.

This article ends with a compilation of the discussed items in a model/framework, shown below:

Hope to find learning analytics of good use in my research area of interest,
- Shazz @ LAK
15 Jan 2011

Review on Goldstein (2005)

I just skimmed through Goldstein's (2005) article as suggested for LAK11. Honestly, this is the second time I've skimmed through it, without really concentrating on the details the author is trying to point out. Check out the article at: http://net.educause.edu/ir/library/pdf/ecar_so/ers/ers0508/EKF0508.pdf

Overall, Goldstein has very good findings from real-case examples, and some relate to the findings I would like to have in my own action research. The only problem is access to this. I believe this will be more relevant and interesting to researchers in education management than to a small lecturer like me, since academic analytics allows researchers to view overall statistics concerning students' activities, marks, registrations, administrative aspects, and such.

The author has a clear outline of what to do in research such as this, and it can be applied to any scenario (even if it's not a higher learning institution) with further customisation once the needs are properly defined.

Hope to have a clearer mind after a few more readings,
- Shazz @ LAK
15 Jan 2011

Reviews by Schawn on Week 1 Readings

Schawn did a remarkable job in reviewing the readings for Week 1 of our LAK course. He focused on Ms Elias' article, which in turn provides points that make more sense to my understanding! Check out his blog at: http://sethropp.posterous.com/learning-analytics-explain-week-1-of-lak11

In return, I've placed a comment on his blog, as stated below:

Thank you, Schawn, for the good review on Week 1 readings.

I would like to point out the 4 points you have summarised here, especially item number 2 - "During the design of the learning experience how can the data be captured in a non-intrusive way. If you cannot design and develop the capturing this data in a non-intrusive way, how can you design it in a way where the student will participate in providing enough quality data."

It is true that it is not easy to get students to participate in the first place, even though they spend most of their time on the same tools for personal use. They somehow do not 'trust' the sharing of info through the tools, even if it benefits them. With that as a problem, another consequence is in getting 'quality data'. So far, I have only managed to get fewer than 10% of the students to participate actively, and even this subsides over time. It's quite tricky to get good stats out of this kind of action research without the students' cooperation. They believe in 'open' source only for their personal benefit (taking) and don't really bother or believe in giving back to the community (giving). There's always a lack of give-and-take in this situation. So how to go about it?

Defining the goal is important, but then again, the action itself requires a lot of "promotion" and "negotiation with rewards" to get others to participate and provide quality data. Not easy. If there's any way to sort out these 2 points before getting reliable and valid results for the other 2, it would be wonderful...

- Shazz
Kuala Lumpur

Hope to read more 'sense-making' reviews from participants of LAK11! ;D
- Shazz @ LAK
15 Jan 2011

Friday, January 14, 2011

Learning Analytics with Stats

Check this out, a slide by John Fritz... Interesting stats included.

More slides to share, I hope...,
- Shazz @ LAK
14 Jan 2011

Learning Analytics

Let me share this slide, which is quite straight-forward in applying the theory/model to the real-life situation. Check this out!
Another one coming up...,
- Shazz @ LAK
14 Jan 2011

A bit on PLE

As I was skimming through the blog suggested for reading, for the course of LAK11, I bumped into this:

This is from Scott Leslie. It is clearly not ONE technology that allows you to work with others… the middle section of the diagram is about the physical location of the learner (say, the desktop), and the others extend from there. They include technologies, people, relationships and events, from a 'personal' perspective.

Definition – This is a Personal (as in, me the person) learning environment
(as in, the ecology in which I learn)

More details are available at: http://davecormier.com/edblog/2010/10/10/disaggregate-people-not-power-part-two-now-with-more-manifesto/

Hoping my students can take up and learn from here too,
- Shazz @ LAK
14 Jan 2011

Bookmarking in here...

I feel the need to 'bookmark' some links here, for my later visit. These are the resources and reflections from LAK11.

Dave's blog with summary of activities from Week 1:

Analysis on Week 1 participation in MOOC, done by our facilitator, Tanya:

This is George's NetVibes page, with 'nuggets' that summarise all the interactions on the topic of LAK11:

I will add my review on the rest of the readings from Week 1, once I manage to catch up over the weekend... In the meantime, I need to check out all the recordings too:

- Shazz @ LAK
14 Jan 2011

Tuesday, January 11, 2011

The gist of the first reading

The following is the gist my eyes could catch while skimming through the journal article by Baker, R.S.J.d. & Yacef, K. (2009), "The State of Educational Data Mining in 2009: A Review and Future Visions": http://www.educationaldatamining.org/JEDM/images/articles/vol1/issue1/JEDMVol1Issue1_BakerYacef.pdf

My short review is laid out at the end...

The Educational Data Mining community website, http://www.educationaldatamining.org/ , defines educational data mining as follows: “Educational Data Mining is an emerging discipline, concerned with developing methods for exploring the unique types of data that come from educational settings, and using those methods to better understand students, and the settings which they learn in.”

Another name for data mining is Knowledge Discovery in Databases (KDD) - it is the field of discovering novel and potentially useful information from large amounts of data [Witten and Frank 1999].

Baker [in press] classifies work in educational data mining as follows:

  1. Prediction
    - Classification
    - Regression
    - Density estimation
  2. Clustering
  3. Relationship mining
    - Association rule mining
    - Correlation mining
    - Sequential pattern mining
    - Causal data mining
  4. Distillation of data for human judgment
  5. Discovery with models
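As a tiny illustration of the 'prediction (classification)' category - my own toy example, not from the article - here is a one-nearest-neighbour rule that predicts pass/fail from two invented features:

```python
# Toy training data: (quiz average, forum posts) -> outcome.
# All values are invented for illustration only.
train = [
    ((0.9, 12), "pass"),
    ((0.8, 9),  "pass"),
    ((0.4, 2),  "fail"),
    ((0.3, 1),  "fail"),
]

def predict(features):
    """Label a new student by the nearest training example
    (squared Euclidean distance over the two features)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(train, key=lambda row: dist(row[0], features))
    return label

print(predict((0.85, 10)))  # 'pass' - nearest neighbour is a passer
```

Real EDM prediction models are of course far richer (regression, density estimation, and so on), but the shape of the problem - learn from past students, predict for new ones - is exactly this.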

Student models represent information about a student’s characteristics or state, e.g. the student’s current knowledge, motivation, meta-cognition, and attitudes. Modeling student individual differences in these areas enables software to respond to those individual differences, significantly improving student learning [Corbett 2001].

As Bartneck and Hu [2009] have noted, Google Scholar is the most comprehensive source for citations – particularly for the conferences which are essential for understanding Computer Science research.

Recent years have also seen major changes in the types of EDM methods that are used, with prediction and discovery with models increasing while relationship mining becomes rarer. How would these trends shift in the years to come?

Educational data mining methods have had some level of impact on education and related interdisciplinary fields (e.g. artificial intelligence in education, intelligent tutoring systems, and user modeling).

Basically, this article surveys the literature written on the topic of Educational Data Mining (EDM), and how the research trend has shifted from one aspect to another. Initially, the method used for EDM was mostly relationship mining, but at the later stage researchers are using prediction and discovery with models more as methods.

One part that interests me in this article is the impact EDM has had on related fields such as Artificial Intelligence (AI), which is what I've been into this past year. In some ways, I have a 'hunch' and prediction that I may be doing some research in this area in connection with AI.

Start the ball rolling,
- Shazz @ LAK
11 Jan 2011


Learning analytics

“The measurement, collection, analysis and reporting of data about learners
and their contexts, for purposes of understanding and optimising learning and
the environments in which it occurs” - Learning Analytics 2011 Conference site: https://tekri.athabascau.ca/analytics/

Digesting slowly...,
- Shazz @ LAK
11 Jan 2011

Playing around with Hunch

The first exercise for Week 1 is to try out Hunch site, and answer some questions in the course forum...
Playing around with Hunch
by George Siemens - Sunday, 9 January 2011, 08:24 PM
If you created a Hunch account (week 1 activities: http://learninganalytics.net/syllabus.html#Week_1 ), share your reactions with others - were the Hunch recommendations accurate? What are the educational uses of a Hunch-like tool for learning?

In response to this exercise, here is my review:
As the name says: "Hunch"...

Q1: Hunch recommendations - accurate?
I would say that it's merely suggesting, based on the analysis it performs across different aspects of the data it retrieves. I noticed that the books recommended by Hunch relate more to the 'idols' or 'people who influence me' in my FB profile than to my answers to the questions.

Even the TV shows it recommends are based on the "type of series" shown - family weekly series (probably influenced by my FB list) rather than my usual preference for movies and non-series shows. The recommendations are good, no doubt, but not really what I would go for. I mean, "Star Trek" and "The Big Bang Theory"? Come on... I know I idolise Stephen Hawking, but that doesn't mean I would want to watch such series - it's a totally different aspect of idolising a person.

Q2: The educational uses of a Hunch-like tool for learning?

One use I can figure out, regardless of whether the predictions are accurate, is that I could anticipate the type of students I would face in my class. It's something I already do in my own physical class - e.g. I asked my class today (the first class with full attendance this semester) about their programme backgrounds, just to know what kind of audience I'm facing, so that I can relate to them later with my examples and help them better understand my teaching.

I guess Hunch can be applied in the same manner - not necessarily accurate all the time, but on average acceptable as a kickstart to a whole new venture of knowing the people you're dealing with in learning and teaching.
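For curiosity's sake, here is a minimal sketch of how a Hunch-like recommender might work - my guess at the general idea, not Hunch's actual algorithm: compare a new user's question answers with past users' answers, then recommend the items liked by the most similar past user. All names and data below are made up for illustration.

```python
# Toy nearest-neighbour recommender, a rough stand-in for a
# Hunch-like tool (not the real algorithm).

def overlap(a, b):
    """Fraction of questions two users answered the same way."""
    same = sum(1 for x, y in zip(a, b) if x == y)
    return same / len(a)

def recommend(new_answers, past_users):
    """past_users maps name -> (answers, liked_items).
    Return the items liked by the most similar past user."""
    best = max(past_users, key=lambda u: overlap(new_answers, past_users[u][0]))
    return past_users[best][1]

# Hypothetical past users: their yes/no answers and their favourites.
past = {
    "Ana": ([1, 0, 1, 1], ["Star Trek", "The Big Bang Theory"]),
    "Ben": ([0, 1, 0, 0], ["Nature documentaries"]),
}

print(recommend([1, 0, 1, 0], past))   # prints: ['Star Trek', 'The Big Bang Theory']
```

Which perhaps also explains my earlier complaint: if the "most similar" profile is built from FB idols rather than my actual answers, the recommendations will drift toward the idols' tastes, not mine.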

Oh ya, it's also about trust. But then again, we can't rely fully on Hunch's analysis and trust it more than the learners/colleagues themselves. If I put myself in my students' shoes, I would probably believe and rely on the data and suggestions Hunch gives to decide whether to trust the student 'next to me'. But as an adult learner and teacher - and a non-digital native - I believe technology is merely the art of humans, so why rely fully on it without personally venturing to find out for sure whether to trust the person or not?

As mentioned earlier, Hunch's predictions/analysis merely kickstart whatever you want to do (or decide) next... It's like doing research - it may (or may not) start from your own "hunch", with some facts lying around in your head that need to be sorted out in order to become more justified and make more sense.

As a lecturer who is known to be a people person, I believe Hunch can be used to get to know another person, in order to ease later communication, convey messages, and set boundaries within the scope of the audience's understanding.

Hope I'm not off-track in my answers. ;D
- Shazz
Kuala Lumpur Time: 2.43AM

Sharing what is learnt, in the process of learning what is shared,
- Shazz @ LAK
11 Jan 2011

Week 1 on LAK11

Check out the slides shared last night (KL time) as our online-conference orientation. I missed the live presentation, but I have the slides here and I'm listening to the audio recording (MP3) right this moment... Cool!

From this slide, there are some models and frameworks that interest me, especially the one in the last slide. Marvelous!

Even though this is a free, open online course, the number of participants is tremendous... beyond my expectation. So far, I seem to be the only participant from Malaysia... or even Asia! I hope the rest of the journey is smooth sailing, learning something new.

The trick is just to keep an open mind, "skim and dive in",
- Shazz @ LAK
11 Jan 2011

Sharing from Siemens'

As my first blog post in the realm of Learning and Knowledge Analytics, let me share a slide deck prepared (and shared) by a wonderful e-learning facilitator, George Siemens from Canada.

This slide deck will prepare your mindset for the coming venture into Learning and Knowledge Analytics, which I will share more about in this blog.

Hope we can learn something from this humble blog, and enjoy the ride... ;D
- Shazz @ LAK
11 Jan 2011