I was teaching Grade 9 Business and the students spent time exploring Data Processing, Accounting, Marketing, and Office Practice. It was in my Accounting section that I asked this question …
“Why do you use a pen when you do bookkeeping?”
And one memorable response was …
“Because if you used a pencil, people could arasit”.
Sound it out.
Recently, Alfred Thompson, Peter Beens and I had a chat about this.
Peter offered an assignment that he used with his students related to the concept. It’s linked here.
As computer science teachers, we kind of nerded out on how to interpret this. In my mind, it’s relatively easy since, while the text is scrambled, all of the original letters are still there in each word. There are only so many permutations, and you could run them against a dictionary.
What if there were words like “arasit” though?
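That brute-force idea can be sketched in a few lines of Python. The tiny word list here is just a stand-in for a real dictionary file, but it shows both halves of the point: real scrambled words fall quickly, while a made-up word like “arasit” defeats the dictionary check entirely.

```python
from itertools import permutations

# A toy dictionary; a real solver would load a full word list from disk.
DICTIONARY = {"listen", "silent", "enlist", "erase", "it"}

def unscramble(scrambled):
    """Return every dictionary word that uses exactly these letters."""
    candidates = {"".join(p) for p in permutations(scrambled)}
    return sorted(candidates & DICTIONARY)

print(unscramble("netsil"))  # ['enlist', 'listen', 'silent']
print(unscramble("arasit"))  # [] -- a made-up word matches nothing
```

Of course, generating every permutation gets expensive fast (a 10-letter word has over 3.6 million of them); a smarter solver sorts the letters and looks them up in a pre-indexed dictionary instead.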
I’ve been hanging on to this article for a while now thinking about the implications.
As far back as I can remember, this has always been the promise of technology. It’s exciting to think that a computer could do the assessment/marking and leave more time for the teacher to do the actual teaching and class observation.
I remember a science department where all of the marking was done on bubble cards. The philosophy was that assessments should be completely objective. Of course, there are the questions themselves. We know that there really is an art to creating questions.
Not all assessments can be distilled to this format though. And, they never should be. Students should equally be assessed on their abilities to communicate and create original responses, and that doesn’t come down to choosing a, b, c, or d.
That’s why I’m particularly intrigued about the concept in this article. What would it actually take to write a piece of software that could assess an essay or a piece of artwork or a computer program, for example? They’re substantially more sophisticated.
From my perspective, it’s way in the future. But, I’m not brave enough to say never. Given what’s possible with Artificial Intelligence today, it’s not inconceivable that it could happen as the science matures.
Just not anytime soon. It will be interesting to follow this competition and see how it turns out.
what’s the difference between a Home Mini and a Nest Mini?
how old is Justin Bieber?
Such are the conversations between my Google assistant and me. To be honest, it’s not a very exciting back and forth. At this point in time, it’s more of an amusement than a real change in the way I do things. It’s fun but I haven’t reached that sense of revolution yet.
There’s always this nagging feeling that I could and should be doing more with it. It’s a lot like me with a television remote control – I don’t just want to know what’s on but what else is on.
Fortunately, rather than trial and error, there’s a wonderful resource site at:
We read so much about how AI is the future. At times, the things that pop into the news seem so far-fetched that it’s easy to write them off as the sort of thing that only the truly geeky can appreciate.
Then, something comes along that brings things down to our level – you know the sort of thing that you can experiment with on your own personal computer and get some interesting results. This makes you think that maybe there’s something about this for the future after all!
The tutorial talks about a three-step process – Gather, Train, Export.
Gather is kind of fun sitting here in my computer area all by myself. I’m not sure that I’m ready to do this in front of the family.
Train is at the heart of it all. Once you have gathered enough information, you get the opportunity to see how your model recognizes new things.
Export lets you take the results of your hard work (or seemingly hard work since I was learning more than just working the tutorial) for purposes beyond the tutorials.
I’m not sure that I can claim to be all that more of an expert in this as a result but I sure learned a great deal about this Teachable Machine. The site isn’t one to leave you to fend for yourself. There are tutorials to help along each step.
Not surprisingly, since it’s a “withgoogle” project, there are plenty of YouTube videos to support you. Above and beyond the actual work, there are articles that will let you do some more in-depth reading. In particular, the Ethics article was interesting.
If you’re looking for a product to dabble with in your classroom, you need to check this one out.
At the Bring IT, Together Conference last week, I had set aside half a day to spend in Tim King’s session on security. As it would happen, unfortunately, the voicEd Radio show was being recorded at the time so I had to miss it.
I’m not sure whether or not presenters got a registrant list so I hunted him down over lunch to let him know of my absence. He didn’t seem to be too disappointed (maybe it was the birthday gift I’d given him last summer) but then he indicated that I needed to go to this other session in the afternoon. It was given by Gordon Alexander from IBM. He was going to talk about IBM’s Watson and augmented intelligence. He also had good IBM swag to give away. I like swag.
So, I went and thoroughly enjoyed things. While Mr. Alexander had to step aside for a couple of minutes at the start, Tim began the session and, of course, we had to get up to speed with Watson and its work on Jeopardy.
I’d seen it before but was interested all the same. Then, Mr. Alexander came back and delivered the message that was the heart of the presentation. Then, we got a chance to try it ourselves.
All of us in the group were invited to create a free account on Watson and it was smart enough to reject this flood of requests from the same location! So, we switched off the wifi on our phones and created the accounts on our own data. Success.
The activity that we were to take part in was to create our own chatbot. Tim’s TEJ class had had the same presentation as we did and they did things like an interactive pizza ordering bot. That would definitely be of interest to students and so we were off. The interface was very much like flowcharting from years ago, before flowcharting kind of went away for programming as we shifted away from a procedural paradigm. It seemed very natural and fluid to me and I was plugging away when Tim offered an already created product that he thought we should explore. Diversion time.
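Watson’s builder is graphical, but the intents-and-responses idea underneath it can be sketched in a few lines of Python. To be clear, this is a toy illustration with invented keywords and replies, not Watson’s actual engine – but it’s the same shape as the pizza-ordering bots Tim’s students built:

```python
# A toy intent-matching chatbot. Each intent maps trigger keywords to a
# canned reply; keywords and replies here are invented for illustration.
INTENTS = {
    "order":    (["pizza", "order", "buy"],
                 "What size would you like: small, medium, or large?"),
    "toppings": (["topping", "pepperoni", "mushroom"],
                 "We have pepperoni, mushrooms, and peppers."),
    "hours":    (["open", "hours", "close"],
                 "We're open from 11 a.m. to 11 p.m."),
}

def reply(message):
    """Return the reply for the first intent whose keyword appears."""
    words = message.lower().split()
    for keywords, response in INTENTS.values():
        if any(k in words for k in keywords):
            return response
    return "Sorry, I didn't catch that. Could you rephrase?"

print(reply("I'd like to order a pizza"))
```

A real assistant adds what this toy lacks: fuzzy language understanding instead of exact keywords, and dialog state so the bot remembers that you already picked a size.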
It was called Personality Insights and the claim was that Watson would take your content from Twitter and make observations about you as a user. I guess it makes sense since when you post to Twitter, you don’t have any expectation of privacy. So, we gave it a shot. We also found out that the source is on GitHub.
By this time, Tim’s wife Alanna had come alongside me. She wanted to see my results. Only if I could see hers! She grabbed her tablet and off the two of us went to explore.
Thankfully, I had a fully qualified teacher-librarian sitting beside me to explain some of the things that got analyzed. Gregariousness? It was interesting. A couple of people had tweeted out the direct link and others who saw it were checking it out. It’s too bad because they would have lost the context of being in the workshop.
But, the bigger thing was to go to Watson’s page and the personality insights page to get the bigger picture of the program/product. It added a great deal to what I think about when I talk about advertising and business. Overtly, who hasn’t chatted with a bot on a car dealership’s website? What is happening that we don’t know about?
Very quickly, the 2.5 hours was over. We hadn’t had a break because we were so engaged with Watson and the discussions surrounding it. I found the conversation about guidance departments and student support with products like Watson very interesting. Maybe even a bit scary. What happens if Watson is wrong?
So, thanks, Tim for steering me in this direction for that afternoon. I really enjoyed it.
And, I got a pair of blue IBM ear buds to take home with me. I wonder if they’re “powered by Watson”?
From Rolland Chidiac, a couple of posts describing how he’s using sports software in his classroom. There’s a great deal of gaming and application of the concept throughout these posts.
In the first post, Rolland describes how he used a piece of software in his class with a bent towards going beyond the game and making a connection to Mathematics. All games have some way to keep score of your progress. That’s what they’re all about. How else would you know if you won? In this case, Rolland’s students capture their scores and create their own collection of data to analyse. It reminds me of the time clocks kept during Formula 1 auto races. Then, based on this data, students predict how they could make their scores better.
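That kind of score analysis doesn’t need much machinery. Here’s a small Python sketch, with invented scores standing in for a student’s data, computing the sort of “how am I improving?” numbers Rolland’s students might look for:

```python
# Invented game scores from five rounds, standing in for a student's data.
scores = [120, 150, 145, 180, 210]

average = sum(scores) / len(scores)
best = max(scores)
# Round-over-round change: positive numbers mean improvement,
# negative ones point at rounds worth reviewing.
changes = [b - a for a, b in zip(scores, scores[1:])]

print(f"average: {average}, best: {best}, changes: {changes}")
```

Even this much gives students something concrete to graph and discuss: an average to beat, and a dip (the −5 round here) to explain.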
In the second post, Rolland gets a little bit constructive with an unused Xbox from home that arrives at school. The Xbox is an amazing device; I recall once at a Microsoft event where some students from Seattle had created a Fish Market simulation.
Rolland’s post, in this case, gets a bit technical about how he actually sets things up in his classroom. He provides an interesting list of the expectations that he has in mind.
I wonder if there might be a third post where the focus turns to students writing their own games.
I think that you’ll really enjoy and be impressed with the project described in this post. Fair Chance Learning partnered with Seven Generations Education Institute to create virtual reality content to help preserve community traditions. Personally, I spent a great deal of time doing some background about the Seven Generations Education Institute so that I could truly appreciate what was happening.
The context is the Fall Festival which traditionally celebrates how the Anishinaabe prepare for the winter. The post talks about:
Like years before, local elders, volunteers and SGEI staff demonstrate wild rice preparations, tell traditional stories, sing at the Grandfather drum and cook bannock on a stick. Unlike in years past, Fall Harvest 2019 incorporated virtual reality to help preserve these vital cultural teachings and enrich the education of our students.
I’m intrigued and will try to follow the results from this initiative. Hopefully, Fair Chance Learning will document it all on their website so that other communities can enjoy the benefits and do something on their own in other locations.
If you’re going to the Bring IT, Together Conference, there might be an opportunity to see this firsthand.
So, here’s the other two-parter from Irene Stewart and her work at St. Clair College.
In the first post, she talks about those favourite teachers and how they become a model for you. It was quite an easy process for me and two teachers most certainly sprang to mind. I have no question about how they were models for me as I became a teacher. I felt bad, though, because what I remember most is their lecturing approach. I like to think that my classrooms were more of an activity-based environment. I do struggle to think of activities from these teachers. I know that they were there but they’re not what first springs to mind.
The second post describes a pilot and then ultimately an implementation of a course, THRIVES, that builds awareness of the college environment and what it will take for students to be successful. The numbers she describes blew me away. 1,000 students in a pilot. Then nine sections with a total of 6,000 students.
There is an interesting reflection about activity in the course and the quizzes involved in the timeline throughout. It’s a nice reflection and I’m sure helped move her thinking as she went from pilot to implementation.
I check in with Lynn Thomas periodically as she works her way through the alphabet and shares her insights on the work she chose. In this case, the word is “Patience”.
Is it a requirement to be a teacher?
I’d go further than that … it may well be the best attribute that anyone who aspires to be a teacher should have. After all, as teachers, we absolutely know the content. The students, not so much, or not at all. Success comes as a result of the transfer of attitude, knowledge, and skills.
Every student proceeds at his or her own speed. There is no one speed that fits all. The best teachers recognize this and exercise the patience to make everyone successful.
Education, as we know it, can be counter to this at times. We have defined times for courses and grades and an assessment has to be given whether you’re ready or not.
As I noted last week, I’m not necessarily a fan of PD at staff meetings because it isn’t always applicable to all. This concept, however, would definitely be worthwhile doing.
I’ll bet this post gives you some inspiration for thought – both as teacher and as student.
I spent considerable time on the release of this new feature that teachers can incorporate into their workflow if they are using Google Classroom.
It’s called “Originality reports” and it’s Google’s spin on checking for plagiarism in student work.
It’s not the only player in the field. Just do a search in your favourite internet search engine for “plagiarism checker” and check out the results. In Ontario, another product has been licensed by the Ministry of Education.
The genesis of that licensing happened about the same time that online learning became popular. The logic was that since you couldn’t see the student face to face working on a project, there was no way you could guarantee that the work submitted was the student’s own.
I remember discussing the product with our eLearning teachers at the time. Their response was pretty negative. Those teaching English, in particular, claimed that their professional judgement was better than any program. By working with a student for a semester, they were able to identify writing styles and literacy skills and could see them grow throughout their time together. Consequently, while the licensed product was available to them, none of them said that they had used it.
Now, to be honest, this was long before there was an “app for that” mentality for computer users. It would be interesting to have that conversation today.
Collaboration is something that I promoted when teaching Computer Science. Granted, we didn’t write long essays but I’d argue that any programmer develops their own individual coding style much like writers develop a writing style. When there were times when I questioned original work, it was a matter of sitting down next to the student and having her or him explain the program to me. Between that and my insistence on written documentation for each problem, I think I did an OK job of making sure that things were original.
Like many of the other products, Google starts off by promoting this product as an aid to help students submit their best work. In the next breath though, the article indicates that Google has access to billions of resources online. That makes sense – that’s one of the things that it does best. So, it’s not a huge leap to make the claim that work that isn’t properly cited is easily identified.
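Google doesn’t publish the internals of Originality reports, but one common technique these checkers rely on is comparing overlapping word n-grams between a submission and candidate sources. A minimal sketch of that idea, with invented example sentences:

```python
def ngrams(text, n=3):
    """Set of overlapping n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(submission, source, n=3):
    """Fraction of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    return len(sub & ngrams(source, n)) / len(sub) if sub else 0.0

source = "the quick brown fox jumps over the lazy dog"
copied = "the quick brown fox jumps over a sleeping cat"
print(overlap(copied, source))  # about 0.57 -- a big shared run of text
```

The hard part, and where Google’s index of billions of resources matters, isn’t this comparison – it’s having every possible source on hand to compare against.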
We live in a day and age of privacy concerns and Google addresses it in the announcement, claiming that the student work remains the student’s. Unless, of course, they blog about it! But the announcement also indicates that there is a plan to expand this to create a repository of past assignments for checking against things already submitted at the school.
I think it’s going to be interesting to follow the success of this product.
will die-hard users of other products make the switch?
will it only be available in Google Classroom?
if a teacher was hesitant to use another product because of professional judgement, will they try this one?
will a demo at the beginning of the semester frighten everyone enough that it’s not needed throughout the course?
how will parents react to their child’s work being used by a Google product?
how many submissions for conference presentations will be focused on promoting this tool?
I’d be interested in reading your initial reaction to this product. Are you in or out? Why?