Category Archives: Open Source


Staying aware of things is always the best advice for anyone who connects her/his computer to the internet.  We were really made aware of this over the past week with the announcement of the Heartbleed bug.  It’s scary stuff, especially when you think of how long it has been in existence and how we’ve become so accustomed to relying on the supposedly secure connection between our computers and the websites we visit.

At the bottom of the Wikipedia article linked to above, you’ll find a list of websites that have been affected.  The common sense approach would be to change your password on those sites – once they are patched.

Other articles offering advice include:

A really good resource for all things Heartbleed:

Today’s Naked Security Podcast offers an audio insight into what’s going on:

Users of LastPass have a built-in bit of confidence.  Just head to the Tools menu and run a Security Check.  All of the sites that you have saved in this utility are checked.  You’ll see whether each site has been patched or not, along with a recommendation to get over there and change your password if the site is ready to go.

Or, if you’re not using LastPass, they offer

And, for the truly concerned browser, the Chromebleed extension keeps an eye on the sites that you browse to and warns you if a site you’re about to visit is still affected.

This issue is going to take a while to resolve.  I read one report that indicated that 66% of the web could be at risk.  That’s a scary thing.  In the meantime, it’s a good idea to do some research and stay on top of what’s happening.

For the really technical minded, read some code.

And, if that’s too deep, take it in as only XKCD can describe it.
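For a rough sense of what the bug looks like, here’s a tiny Python sketch of the pattern (my own illustration, not OpenSSL’s actual C code): the server echoes back as many bytes as the client *claims* to have sent, without checking that claim against what actually arrived.

```python
# Simulated server memory: a 4-byte payload sitting right beside secrets.
MEMORY = b"bird" + b"SECRET_KEY_MATERIAL"
PAYLOAD_LEN = 4  # what the client actually sent

def heartbeat_unchecked(claimed_len: int) -> bytes:
    # Vulnerable pattern: trust the length the client claims.
    return MEMORY[:claimed_len]

def heartbeat_checked(claimed_len: int) -> bytes:
    # Patched pattern: silently drop requests whose claimed length
    # exceeds what was actually received.
    if claimed_len > PAYLOAD_LEN:
        return b""
    return MEMORY[:claimed_len]

print(heartbeat_unchecked(23))  # b'birdSECRET_KEY_MATERIAL' -- the leak
print(heartbeat_checked(23))    # b'' -- request refused
```

The real fix in OpenSSL does essentially what `heartbeat_checked` does: discard any heartbeat whose stated payload length is longer than the record that carried it.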

Web Browsers

As I look up and down the launcher, I see four browsers ready to go on this computer.  (Running Ubuntu)

  • Chromium
  • Epiphany
  • Firefox
  • Google Chrome

That’s certainly not a complete list.  Check out this post for alternatives.

My browser as I create this post – Firefox.  It comes as the default browser in Ubuntu and, while I’ve tried the others, it just seems to be the best actor.  I’ve become used to its extensions and I really like the Puny Weakling theme.  I guess, in the big scheme of things, I’ve always been a fan of Firefox and the way that it handles things.

If I were to reboot this computer and load Windows, Firefox would be there but the default browser is Google Chrome.  If I were to grab my Macintosh, the default would be Google Chrome as well.  Why the difference?  I wish that I could point to some great technological reason but I can’t.  I think it all has to do with the extensions that are installed in each.  Plus, the little Android in the top right corner has a frown that keeps me on task.


In terms of browsers, I’m the ultimate hoarder, I guess.  I should mention that there are a couple of Operas on Windows and Macintosh, and who knows what else got me interested at one time.

Right out of the box, though, Windows wants to make Internet Explorer the default.  Macintosh wants to make Safari its default.  There’s really not a problem with these browsers these days.  They just keep getting better.  And, they had darn well better be the default browser for each platform.  After all, they’re crafted by Microsoft and Apple.  They should be the best that each of those companies can produce.

Which brings me back to Ubuntu.  It doesn’t create its own browser.  In fact, it has the choice of the field and all that is available.  Conceivably, they could have worked on porting the others if those vendors opened up their source.  With all that is available, they have chosen Firefox.  There were rumblings that this might change this October and yet, the news is that “Firefox To Remain Default Browser in Ubuntu 13.10”.

In an internet world where applications and placement are bought and sold, what does this say about the quality of the browser?  Will the debate continue on to release 14.04?

What browser do you choose to use?  Why?  Do you have a particular brand loyalty or does it even matter to you these days?  For students, how important is it for them to know that there are choices – that there are proprietary and open source alternatives?

Ontario Educator Smackdown

At least sort of…

I’ve been wanting for a while to play around with the new features of a service that is trying to build a set of tools to help you build infographics.  These are the hottest things around and help you build and display messages and content.  I’ve been poking around; there is a nice collection of templates built around the Twitter and Facebook services.

I wanted to have fun but I didn’t want to embarrass anyone, so I tried and tossed out a number of different trials.  In itself, I think that’s impressive.  If I were designing and building an infographic manually, I certainly wouldn’t be able to make so many tries this easily.

What I ended up with is a tribute to two of the best Ontario educational minds.  Both of these individuals push my thinking each time that we get together and between times online.  They are both acknowledged leaders in the province and have contributed so much through their work with the Educational Computing Organization of Ontario, the Ontario Teachers’ Federation and, of course, Minds on Media.

I’m talking about Brenda Sherry and Peter Skillen.  And, there’s a smackdown for that on the site.  It’s just a matter of providing a couple of Twitter names and letting it build the infographic.  Here we go!

If you have ever had the pleasure to meet these terrific people, you’ll know that it should come as no surprise that you’ve got to call this smackdown a draw.

Check out the service yourself for a little fun and also the Ontario Educators on Twitter list for some raw data to fuel your fun.  If you’re an Ontario Educator and not on the list, you can fill out the form here and you will be added.

Powered by Qumana

Access :: Future

I’m on a roll.

In addition to the previous post with all kinds of Open Source software, I was doing my daily read of Stephen Downes’ OLDaily.  There was one entry that particularly caught my eye.  It was a book that he had written titled “Access :: Future – Practical Advice on How to Learn and What to Learn”.

It’s a PDF file suitable for immediate download, which I’ve done.  It comes as no surprise that Stephen has licensed it under a Creative Commons license – nice.

Check out the Table of Contents.

Learning Today 

  • Introduction  
  • The Purpose of Learning 
  • Things You Really Need To Learn 
  • The Mark of Wisdom 
  • Critical Thinking in the Classroom
  • Necessary and Sufficient Conditions 
  • Not All… 
  • Educational Blogging 
  • Your Career 
  • Managing Your Blog Entry: 11 Better Tips 
  • Blogs in Education 
  • How To Write Articles Quickly and Expertly  
  • Principles for Evaluating Websites 
  • Applying Critical Reasoning 
  • How Do You Know? 
  • Having Reasons 
  • How Memory Works 
  • How The Net Works 
  • An Operating System For The Mind 
  • Personal Knowledge: Transmission or Induction? 
  • Virtues Education 
  • Free Learning and Control Learning  
  • The Science of Learning  
  • E-Learning 2.0 
  • To The School or Classroom 2.0 Advocates 
  • The Issues in Front of Us  
  • The Form of Informal 
  • Uniqueness and Conformity 
  • New Technology Supporting Informal Learning 
  • How I Would Organize A Conference 
  • What I Learned in High School 
  • My Personal Passion Trajectory 
  • My Academic Upbringing 
  • Social Media and Me 
  • Seven Habits of Highly Connected People 
  • The Reality of Virtual Learning 
  • Nine Rules for Good Technology  
  • What Not To Build 
  • Ten Futures  

The book is a collection of Mr. Downes’ thinking and original writing over a period of time.  It’s not a book that you should or will read from beginning to end.  I’d suggest that you load it on your computer or reading device, read a chapter at a time, and take time to see the relevance to you.  It’s not necessary to even read it in order.

You need to download this book and read it.  You need to share it with your colleagues.  You need to have a book talk about this.  From the entries, great conversations and learning should ensue.

Really Smart People

Yesterday served as a testament to me that there are a lot of really smart people and I was so fortunate to be able to see them show off just what it is that they stand for.  At the end of the day, it really was a humbling experience.

In the morning, we started with a breakfast speaker.  Dean Kamen got our intellectual blood stirred by sharing stories about his efforts to bring science opportunities to the youth.  He started with a wonderful story of his attempt to bring science to the masses with the creation of a Science Centre but realized that he was missing so many students going this route.  This inspired him to found the USFirst series of robotic challenges.  It was amazing to sit in the audience and hear how this has taken off both in the US and internationally.  I really liked his understanding that youth had their heroes in sports and entertainment but none academically, and he was about to change that through this program.  From its humble beginnings, it moved from the support of 23 companies to championships held at the Epcot Centre and the Georgia Dome.  The most impressive statistic was the $15M that students were able to garner in university scholarships last year.  Quote of the day here is attributed to President Bush when addressing students at the opening ceremonies – “It’s like the WWF only with smart people.”

Then, it was time for the entire conference to get on the bus and we had a nice scenic tour to New York University where the learning continued.  This time, it was another overview of the pilot CS Principles course.

Paul Tymann provided a wonderful overview of his implementation of this course last year.  It was designed for those students who might not otherwise engage in Computer Science because of a fear of mathematics, or who were perhaps just browsing to see if there was some interest in the discipline.  There were four big themes to the course: Nuts and Bolts, Algorithmic Thinking, Computing Systems, and People and Computing.  That would indeed provide a nice introduction to Computer Science and also to its societal implications.  Paul shared some of the things that worked:  image manipulation, steganography, and “go buy a computer”, and some things that didn’t:  generate a webpage, accessibility for all.  There was a great deal of interest from the audience looking to offer this course themselves when it’s finalized.  Resources are online, with links to the College Board website.
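It’s easy to see why steganography works as a hands-on topic.  Here’s a minimal Python sketch of the classic least-significant-bit trick (my own toy example, not the course’s actual materials): tuck message bits into the low-order bit of each pixel value, where the change is invisible to the eye.

```python
def hide(pixels, message):
    """Hide message bytes in the least-significant bit of each pixel value."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then set it to the message bit
    return out

def reveal(pixels, length):
    """Read length bytes back out of the pixel LSBs."""
    message = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        message.append(byte)
    return bytes(message)

pixels = list(range(100, 180))   # stand-in for grayscale pixel values
stego = hide(pixels, b"hi")
print(reveal(stego, 2))          # b'hi'
```

Each pixel changes by at most 1, so the carrier image looks untouched while the message rides along in plain sight.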

Then we were in for a real treat.  A graduate student, a regular student, and a summer intern shared some of the gaming software that they had been working on at the university.  We saw how a Kinect system monitors movement and got an idea about how that might be used in mathematics!  We got demonstrations of Factor Reactor, Super Transformation, and NoobsVersusLeets.  Development was done in XNA and Silverlight and the whole experience was quite fascinating.  Writing the game isn’t enough; monitoring its use was also important for research.  To that end, we got to look behind the curtain to see, not only the data collected from the user, but how biometric devices like a pressure seat and skin cuffs measure student reaction when playing the games, collecting 8,000,000 data points in half an hour.  The big question was how do you leverage game playing with academic games?  We talked about solo play, competitive play, and collaborative play.  In the cool department was a demonstration of the Do It Yourself Touch Table.  It was fascinating to see what you could do with plexiglass, a couple of Kinect sensors, and the ability to think outside of the box.

Competitive Factor Reactor Game

Then, it was back on the bus to enjoy the New York traffic and head to the Lincoln Centre and the finals of the Microsoft Imagine Cup.  Speaking of thinking outside of the box…

Those who were competing were set up in booths that were open for us to walk through and talk about the projects with the competitors.  The main language of conversation was certainly English but the culture was world-wide.  Students were developing software and prototypes in their quest for a better world.  For the most part, it was just heavy traffic and then an opportunity to chat with the students.  At one point, though, there was this crush of humanity headed my way… Eva Longoria was there to take in the exhibits and talk with the students.  With my phone held high, I got this fuzzy ponytail shot over the shoulder of a really rude professional photographer.

Then, it was show time!  In the Koch Theatre the students all showed up to see who the award winners would be.  I didn’t have the ability to take notes so you’ll just have to visit the Imagine Cup website for the full details.

The presentation was just as good as any awards show might well be.  Betsy and I sat way, way up in one of the top rings with just a few others.

There were some really interesting projects that caught my eye walking through the display area.  One was a helicopter-like device developed by a Singapore team that, by remote control, could survey a disaster scene from on high.  It had two cameras and could be outfitted with a number of sensors (radiation, smoke, etc.) to send important information back to rescue teams.

There were two tablet applications that caught my eye in particular.  One actually did win an award in its class.  It was a system for moving PECS (Picture Exchange Communication System) to a portable device for those students who need it to communicate.  I could see a huge demand for that.  The gentleman who was showing it off was from France and the product is tentatively named after a young lady who needs it for communication.

How about a traffic monitoring system complete with dry ice simulating fog?

Then, I spent almost half an hour talking to another group from India who had a sort of working prototype.  It was a tablet computer for the blind.  Instead of a clear glass screen like we normally think of when we think tablets, it had a braille interface with mechanical pins to interact with the user.  I was blown away with the concept – I wish that they had been closer to a final working product.  Talk about thinking outside the box, er, tablet.

As was noted, these students were all winners with their ideas and implementation.  One question that I asked at every table was how they intended to monetize their product.  Responses ranged from putting out a limited free version with the possibility of a purchased upgrade to donating all the work to Open Source.

The kids are all right.  If you ever have a chance to walk the hall in future Imagine Cups, you absolutely must do so.  Yesterday’s complete results are found here.

Alfred Thompson blogs his thoughts about the day here.

Powered by Qumana

CSIT Symposium 2011

Yesterday was the annual Computer Science and Information Technology Symposium, an opportunity for teachers of CS and IT to get together for a day of learning focussed on these subjects.  Like most conferences, it is an opportunity for learning and sharing, and the conversations among attendees are at least as valuable as the information shared during the sessions.

I must confess upfront a personal bias towards the event as I am one of the organizers.  I think that it’s fair to say that the events and topics are carefully selected for relevancy and currency to ensure the best experience for all in attendance.

My notes for the sessions that I attended are nicely tucked away in Evernote so that I can make reference to them later on.  There were many opportunities to think and reflect about current trends and also to wax philosophical about where Computer Science has been as a discipline.  Some of the highlights appear below.

Morning Keynote – Douglas Rushkoff – Program or Be Programmed
I had been waiting for this presentation ever since we put the agenda together.  I wasn’t disappointed.  It was almost a shame that I was taking notes since the presentation was high energy with many key things to ponder.  Each of the attendees received a copy of Rushkoff’s book and it will be nice summer reading for me.  As the title suggests, he talks about the amount of technology and how we use it daily.  There is a significant difference between those who are passively using the technology and letting the developers determine just what and how it’s used as opposed to those who truly understand how to program and leverage that skill to make the technology truly work for them.  Rushkoff talks about the current state of computer programming and asks the audience if it’s going to take a “Sputnik moment” to realize the advances made by so many outside of the United States to change the attitudes here.  Best quote of the day was “how do you share that active participation is better than passive ignorance?”


BTW, my teacher-librarian friends, this book needs to be on your shelves.  If you’re looking for a great resource for a book talk with staff and/or kids, this would be a really engaging start.

Bootstrap: Algebraic Programming for the Middle School Classroom – Emmanuel Schanzer
I was forced to attend this session as proctor and I’m so glad that I did.  I had never heard of Bootstrap before but I sure have now and I’m motivated to dig deeper into it.  It’s free and web-based with all kinds of resources, and this session did force us to look at some of the things about algebra, mathematics, and programming that make it a challenge for students to learn.  We started by asking why x is a “variable” in the equation 6 = x + 2.  Bootstrap essentially takes the ambiguity and jargon away from mathematics and just gets down to using it as a tool to solve problems and also to develop one’s own code.  I learned a new concept – that of Circles of Evaluation.  I was fortunate enough to sit next to a gentleman who used Bootstrap and he helped me as I tried to fumble through some of the examples.  We had a great conversation about prefix and postfix notation.  This will definitely occupy my attention for a while.
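A Circle of Evaluation, as I understand it, is just an expression drawn as an operator with its arguments, where an argument may itself be another circle.  Here’s a quick sketch of the idea in Python (my own illustration using nested tuples in prefix form – Bootstrap’s actual materials are Racket-based):

```python
def evaluate(expr):
    """Evaluate a Circle of Evaluation written as a nested prefix tuple."""
    if isinstance(expr, (int, float)):
        return expr  # a bare number is its own value
    op, *args = expr
    values = [evaluate(a) for a in args]  # evaluate inner circles first
    if op == "+":
        return sum(values)
    if op == "*":
        result = 1
        for v in values:
            result *= v
        return result
    raise ValueError(f"unknown operator: {op}")

# (4 + 2) * 3 as circles: the outer operator is *, and one of its
# arguments is itself the circle (+ 4 2).
print(evaluate(("*", ("+", 4, 2), 3)))  # 18
```

The nesting makes the order of operations visible instead of relying on memorized precedence rules, which is exactly the ambiguity the session talked about removing.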

Quick Start to Small Basic – Damian DeMarco
I had used Small Basic with my university class as a tool for the introduction to programming and was curious to see how others were using it.  As I might have guessed, the presenter was a real fan of the resources that Microsoft provides with the product.  We had a walk through of the language and resources.  It confirmed many of the things that I had already been doing with the program.  In addition to the official Microsoft resource, we were shown an alternative resource that extends the teaching materials.

Lunch was great and a wonderful opportunity to just sit and talk to folks for the hour that we were there.  I had a delightful talk with a teacher from Massachusetts, one from Brooklyn, and a couple from the Bronx.  Joining us was a recent graduate who was looking forward to his own classroom in September.  I hope that we didn’t scare him too much!  Over lunch, Mark Hindsbo, VP of US Developer and Platform Evangelism for Microsoft, shared a few moments with the group, recognized some of the students in the audience, and congratulated the teachers in the audience for their continued support of Computer Science.  Where can I get a job with a title like that?

Tips from a CS Principles Pilot: Activities, Techniques & Strategies to Help Make Computing Ideas Accessible to All Students – Jody Paul
This was another session that I had to work.  We were late getting started because of the great lunchtime conversation, and then I had to duck out to get more handouts duplicated, so I missed the opening where Jody set the stage for this pilot.  Fortunately, my friend Chris was in the audience and explained to me that the pilot was all about developing ways to create an engaging introductory course for students into the world of Computer Science.  What I did understand was the approach to the course.  It would be a great challenge for many – the students set the curriculum by expressing their interests and the teaching flowed from that.  There were some great examples shared about how to get into programming without those scary introductory mathematics moments.

Part of my duties involved setup for the closing keynote so I didn’t get to attend a final break out session but then it was on to…

SPIRAL:  Combining Learning, Play and Exploration – Ken Perlin
If you ever needed permission to just program for the enjoyment of programming, you were given it through Perlin’s keynote.  In fact, everyone needs to enjoy the results of his programming at his website.  His session was the perfect one to set the stage for Day Three where we will get to experience Microsoft’s Imagine Cup.  Perlin took us through many of the little applets that he had written and you couldn’t help but be inspired to try to write a few of your own.  It should be great with a summer ahead for folks.  Our setup involved a computer connected to three simultaneous displays that didn’t like switching from PowerPoint to Safari; we didn’t resolve that in time to go live, but a quick workaround and I’m sure that the audience didn’t notice it at all!  The academic part of the talk got serious about games in education and Perlin talked about the research into understanding what leads to learning and what doesn’t – how intrinsic rewards play against extrinsic rewards.  Quoteworthy here was “Computer science doesn’t just need a grammar.  It needs a literature.” – Marvin Minsky

It was a long day of learning but it wasn’t over until the grand reception and door prizes.  On the top floor of the Faculty House, we got a chance to say our goodbyes from the balcony overlooking Morningside Park.

Alfred Thompson was also at the CSIT Symposium.  His reflections are located here.

Powered by Qumana

Spare me the Drudgery

There has been a great deal of buzz about the Khan Academy and how it might have an impact on education.  On the one side, we have seen the thoughts from people like Bill Gates on the matter.  On the other side, we read the thoughts of passionate educators.  One of the most passionate and scholarly approaches was presented by Sylvia Martinez in a series on the Generation Yes blog.  I’ve been doing a great deal of thinking about this myself lately.

As with many things, I find that it’s helpful to take a personal approach to these things and then try to extrapolate to the bigger picture.  I also don’t think it’s fair to make a judgment without spending time investigating personally.  So, it was with some interest that I spent some time poking around anonymously and then by logging in to the Academy.

If we go on sheer numbers, the size of the Academy is impressive.  There are hundreds of videos covering all kinds of content and, when you log in to work through the exercises, the choice is massive.  So, as I took in the videos (which is no small feat since I don’t have the fastest internet and get a lot of rebuffering), it was like sitting in the front row watching a think-aloud lesson.  As I went into the exercises, the experience reminded me of flash cards and the sorts of questions that one would find in a mathematics textbook.  You probably remember the drill – do the odd numbered questions of page 37 for homework.

In fact, it looks like a faithful reproduction of the classroom experience that I had – decades ago!  While I enjoyed the mental math exercises, I did grow weary after a short while and started to look for something else.  The accountability of having homework done was there, I suppose, as the academy does some tracking and offers suggestions when it determines that you’ve mastered things.

I then reflected on what I would call “real learning”.  There’s no question that the content here addressed the snippets well and the volume is enough to choke an educational horse.  What was missing?  In my mind, the “real learning”.  I think back to the great teachers and professors that I’ve had.  It’s tough to imagine any lesson that didn’t bring in the anecdotal comments and experiences, the application of concepts, the jokes to keep us on task, divergent learning that happened, the differentiated approach that was needed to try and reach everyone in the class.  In fact, the only time that I can remember lessons that were totally about content was during times that we had student teachers who were doing their best to cover the content and hadn’t developed the self-confidence to loosen up a bit.

I put this type of learning into a personal context.  As noted in a couple of previous posts, I’ve recently attended edCampQuinte for some personal, professional learning.  The actual content that was covered could probably have been found in a collection of videos on the internet.  But, for me professional learning isn’t just about sitting and covering content.  That’s real Educational Drudgery.  When I attend sessions, I want to talk and interact with my neighbour about the subject to be sure, but I want to have a conversation and I want to brainstorm with someone smarter than I am.

So, while at Belleville, I did learn some content.  For that, I’m grateful for the presenters who spent the time to put together and share their thoughts.  However, in addition to the topics, the learning was enhanced by all of the things that I learned just by asking and answering questions.  In addition to what I learned at the edCamp, I learned so much more about the community by taking a drive around and pestering poor Kent over breakfast about what I had seen.  The added value was in the conversation and the fact that we had driven to get there.  We explored parts of the library during breaks and some of the handiwork on display was phenomenal.  The whole experience was far more than the sum of its parts.

On a personal level, I did share some of my learning about QR Codes.  There’s nothing that makes you know your content more than having to share it with others.  Even though I had documented my learning, I dug deeper than ever before knowing that there might be some probing questions coming from the audience.  I also wanted to make the audience interact with me and push my thinking.  And they did!  Even on the drive home, my mind was spinning about new thoughts and ideas that I was inspired with by the participants.

I also started to think of professional learning without actually being there.  I could have summed up my thoughts in the form of a video and mailed it in.  I could have “covered the content” and felt like I’d contributed.  I’m glad that I didn’t.  The conversations and divergent ideas put the experience over the top.  I don’t think I could ever justify my professional learning as something where I just log in from home, watch a few videos, and do a few exercises.

In that context, I look back at the Academy.  It covers the content, to be sure.  But, if that’s all that we want from an educational experience, then count me out.  It pre-supposes that everyone has the same entry point, learns using the same sort of modality, and hopefully exits with the same sets of skills and knowledge.  I deserve better; our students deserve better.

Having said that, I wouldn’t ignore the content completely.  In fact, a short video is an interesting enhancement to a lesson.  It could provide another voice in the classroom.  It could provide a nice refresher on the concepts for a student working at home.  But, it’s not the whole deal and that’s what I’m fearful of when people talk about this as the solution to all of education’s problems.

In totality, the Academy has provided a very complete menu of content.  You cannot deny that.  But, just as I learned so much more planning for my presentation, I’m sure that putting together the script and the design had the bulk of the learning at the developer end.  As I sit at my computer as a consumer, I get to relive it but I don’t get a chance to ask a question or to talk to the person sitting beside me.  Maybe the real value is to use this as a model for students researching and creating their own videos on a topic to share with the class.

Personally, I couldn’t sit through video after video and the drudgery that follows that kind of one-way expression of ideas.

My First Look at Natty Narwhal

It was a dog walking Saturday morning and I decided to do a little multi-tasking.  While enjoying the park with the dog who was long overdue for a good run with the weather we’ve been having, I set my computer to do an upgrade to Natty Narwhal, Ubuntu 11.04.  With my internet provider, I know that the upgrade could take hours and hours so I elected to start the process when nobody else in the house would need our very little bandwidth.

I wasn’t terribly excited going into 11.04, I must admit.  The inclusion of Unity replacing GNOME had me a little tentative.  Unity was part of Ubuntu 10.10 Netbook Edition.  It looked great and worked like the dock from OS X – kind of – it just wouldn’t hide itself.  On a Netbook, that meant taking up valuable pixels.  With the smaller screen on my Dell Netbook (1024×576), I needed every pixel that I could get.  Scrolling or shrinking the screen size just didn’t seem to be much of a comfortable computing experience for me.  But, before I upgraded my Netbook, I decided to upgrade the installation on my Sony Vaio.  At 1600×900, I have pixels to burn.  Plus the promise of a 3D and multi-touch interface was just too much to ignore.

So, I started the upgrade and left.  Upon return, we were still downloading, so it was time to do some yardwork and then return just in time to bounce between 15 seconds and 60 minutes of estimated time remaining.  Eventually, it was all here; installation done, and I rebooted to the new version.  I was impressed at the boot speed.  Could it be even faster than 10.10?  At times, I wish that I could do these things in a lab setting for true machine to machine comparisons.  But, a log in and I’m good to go – or so I thought.  Ubuntu tells me that the video card that I’m using won’t support Unity.  Could it be?

Sony uses Nvidia cards in this model so I had to do the unthinkable and install restricted drivers.  I was a little concerned about this as well.  With 10.10, I could install the drivers but they wouldn’t give me the full screen resolution.  Could the same happen again?  Surely, this was addressed; I know that there were many of us who had asked for a solution with 10.10.  I downloaded the drivers, rebooted – success!  I have the sweet look of Unity sitting on the desktop.  With the i7 processor in this machine, it screamed.  I’m up and running!

I really like the improvement to its functionality.  The developers have been listening.  When there’s nothing running, the launcher appears.  The moment that you launch an application, the application seizes control of the screen and the launcher goes away.  You can configure it to respond to the cursor, and so mine now works like my OS X dock.  Move the cursor to the left of the screen and I’m good.  And, of course, you can pin applications to the launcher so that frequently used applications are easily accessible.

So, two swings at success.  I am very happy that the Nvidia drivers work to the point that Unity is happily doing its thing.  Onward…

What I like about Ubuntu is that the complete install gives you everything that you need to have a fully functional computer.  Plus, things that were installed previously are upgraded and carried forward.  So, I’m seeing LibreOffice, BloGTK, Banshee Media Player, Celestia, Shotwell, and so much more.  Of course, Google Chrome is my browser of choice (a separate download) and it synchronizes with my other instances so it’s just like home.  Interestingly, Firefox comes as the default browser.

In this day of “stores”, Ubuntu remains ahead of the crowd.  The Ubuntu Software Centre has always been just a click away to grab new titles.  However, the new search incorporated in this release brings it all together.  Clicking the Ubuntu button or pressing the Windows key opens search in a transparent window they’re calling The Dash.  Start typing or you can browse your way to the desired application.  If it’s not there, the Software Centre is integrated right into the search.  Just go ahead and download the application if needed.
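The same search-then-install flow is available from a terminal, for anyone who’d rather skip the Dash.  A quick sketch, using Celestia (one of the titles mentioned above) as the example package:

```shell
# Sketch: the Software Centre's search-and-install flow, done with apt.
apt-cache search celestia          # search the package index by keyword
apt-cache show celestia | head     # read the package description first
sudo apt-get install celestia      # install; the Software Centre does
                                   # the same thing under the hood
```

Either route ends with the application appearing in the Dash’s search results once installed.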

How much more user friendly can you get than that?

Given the success with having the Nvidia driver issue resolved, I wonder about the other nagging thing.  I installed Ubuntu on this computer in Windows using the WUBI installer.  In retrospect, I could kick myself.  Although WUBI warns that its installation may be slower than a pure install, I’ve never noticed it.  The problem that I experienced was the fact that WUBI doesn’t support sleep or hibernate.  Could it be that this issue has been addressed?  I closed the lid and walked away for a bit, admiring the glowing orange sleep button that’s trademark Sony.  Alas, when I opened the lid, the machine may have been asleep but it didn’t want to restore my Ubuntu session.  It’s not an earth-shattering issue; I just have to remember to shut down.  With Ubuntu, shutdown and startup speeds are quite fast so it doesn’t really hurt me.

As I kick the tires and put Ubuntu through its paces, I have complete satisfaction with what I’m seeing.  Ubuntu is fast.  Unity has matured to the point of getting it right.  As an aside, I still find the mixture of open applications and shortcuts in the launcher/dock a little bizarre, but I’m at a loss to come up with a better suggestion.  The screen is so easy to read; the Ubuntu fonts are clean, clear, and crisp.  I use a slightly customized version of the Ambiance theme and it really is easy on the eyes.  So many of the applications that I use these days are web-based anyway, so having Chrome all configured and synchronized is a bonus.  Unlike Windows or Mac OS, where I use the Seesmic Desktop, there is no Ubuntu version.  But the Seesmic Web does a nice job, and changing the colour scheme to black makes it mesh nicely with the Ambiance theme.

In addition to the video card issue, I’m sure that there are all kinds of other new features, and bugs swatted, in this release.  I’m anxious to discover what they look like.  You can’t do an operating system review justice after just one day, and I don’t pretend to do that here either.  The two biggest things that most people will notice are The Launcher and The Dash.  They do seem to make the whole experience nicely functional.

I do admit that I was tentative going in.  I didn’t know what to expect from Unity, but I’m really glad that I gave it a chance.  It very quickly became second nature and an effective way to navigate.  If you’ve got a spare computer in need of a refresh, or you have a Windows machine that you don’t mind dual booting, I would encourage you to download this and give it a shot.  I think you’ll be pleasantly surprised.

Getting it Right … Financially

One of the best inspirational things that I do for myself is subscribe to The Daily Papert.  Through this mailing list, I get a daily bit of inspiration from one of the greatest minds in educational technology, as curated by Gary Stager.  Every day, there’s a quotation related to education, usually with a technology overtone.  I would encourage you to enter your email address for a daily shot of inspiration yourself.

I don’t think that there are too many naysayers about the use of technology in education these days.  But, for all of the enthusiasts and for those remaining naysayers, the conversation almost inevitably turns to money and how we can’t afford the technology.  For years, we’ve tinkered and tried pilot projects (how many times do we have to prove that technology can motivate students?).  We’ve talked about Maine and other 1:1 projects and lusted after the opportunity to replicate them, but it always comes back to money.  In Monday’s Daily Papert, it was addressed.

From The Daily Papert, April 4, 2011

Now, what’s really interesting is that the prices in Mr. Papert’s quotations are from 1983.  It isn’t a huge leap to imagine what the dollar figures are today, almost 30 years later.  Now, we’re not about to sink dollars into Apple II computers, but there are current technologies that would be equivalent in terms of today’s functionalities.

We do have to be financially responsible.  Of that, there is no question.  That’s why another article that appeared recently holds so much interest.  Ewan McIntosh’s entry “Why the cloud’s important for education: saving $199,995 on one test” will make you stand up and think.  Look at the issues that Mr. McIntosh identifies: school boards spending all kinds of money providing internal services when there are free and/or better services readily available on the web.  Of real interest to me is the amount reportedly saved on the administration of just one test.  Imagine the possibility of removing all of the administrative costs, the paper booklets, and everything else that goes into offering these things.

However, a computer is just a computer until you load it up with the necessary software.  In Ontario, we are fortunate to have a program like the OESS, which licenses software recommended by OSAPAC for publicly funded schools.  We are also lucky to have resources like those provided by eLearning Ontario.  Despite the successes of these programs, they don’t provide all that is required for a well-rounded suite of software for students.  Fortunately, there are other great alternatives.  If we delve into the concept of appropriate FLOSS, the opportunities get better.  If we expand our definition of just what software is, web services can fill the job nicely.

Web services remain an open question.  Some districts have restrictive policies while others are less so.  These policies are undoubtedly created by well-meaning internal structures.  However, a thoughtful, structured approach identifying just what is needed would send a set of guidelines to districts throughout the province.  After all, we have an Ontario Curriculum loaded with references.  Getting serious about all of this would enable a consistent suite throughout the province.  And, if a web service isn’t needed on a particular day, the provincially licensed NetSupport School software lets the teacher turn it off at the class level.

Using this link, I would encourage you to add your favourite software (however you elect to define it).  I’ll collate all of the responses and report back in a later post.

Are we ready for more pilots and more tinkering or is it time to get at it?  If we take the finances out of the discussion, does it make a difference?

Plan B

One of my favourite television shows of all time was “The Practice“.  It was a show about lawyers, and one of their strategies was “Plan B”, which would be used on certain occasions as part of the defense strategy.  It made for great drama, and I remember the phrase “We’ll Plan B them” just as if I’d watched the show last night.  I’ve adopted the term myself and use it to represent alternative plans.

If you know me, you know that I’m a big Minnesota Vikings fan.  I’m not a Favre-come-lately.  In fact, my first football jersey was purple and featured the 44 of Chuck Foreman.  That brings back great memories of Fran Tarkenton and Bud Grant.  Last night, the Minnesota Vikings had to kick in their own “Plan B”.  With the winter weather, we’ve all seen the collapse of the roof of the Metrodome, the playing home of the Vikings.

So, what was their “Plan B”?  They hopped onto an airplane and played their game instead at Ford Field in Detroit, the home of the Detroit Lions.  As we now know, the strategy didn’t prove friendly, and they lost badly.

Last week, the Western Regional Computer Advisory Committee hosted its annual Symposium for technology leaders in the southwestern corner of Ontario.  We hold it in the heart of Ontario’s snowbelt, where it’s not uncommon for huge storms to pick up the moisture from a yet-to-be-frozen Lake Huron and dump it on places like Grand Bend, Strathroy, and London!  It has never happened, but what would happen if a keynote speaker that we invite from warmer places was unable to attend?  Well, we have our own “Plan B”, a closely guarded secret that would be implemented if necessary.

In both of these cases, “Plan B” would be less effective than the original.

In the use of technology in schools, I can’t help but wonder if we aren’t stuck in a perpetual “Plan B”.  My Faculty of Education students recently came back from their placements and expressed their frustrations with their teaching environments.  At the Faculty, we work with dual boot iMacs with the Mac OS on one side and Windows 7 on the other.  On each partition, we have access to the entire suite of OESS licensed titles.  Everything that’s available is installed and functional.  The machines are also equipped with an area that allows us to install things on the fly, if necessary.  A good example of this would be Microsoft’s Small Basic, which wasn’t available in time to ask the technical staff to have it ready for us.  But we needed it for a recent practice lesson, so the student teacher mounted a share point on the instructor’s computer, and we all accessed the installer from there and were up and running in seconds.  There’s a “Plan B” that worked.

Is that the norm or the exception in a typical K-12 classroom?  Hardly.  The reality is that you need to plan at least a year in advance for the use of technology and then hope that all is good to go when you need it.  If it’s not, do you have the ability to put a “Plan B” in motion?  If not, why not?

Typically, the answer lies in the way that technology is managed in schools.  Rather than having realistic support levels, we generally have just enough support to get by.  In my previous post “Time to Consider 2.0“, I made reference to a posting that helped you discover if your technical support was 1.0.  I received a rash of emails from folks who wanted to try this or that and were unable to.  Like my faculty students, they had planned their lesson at home, or read about it, and wanted to try it in their classroom but couldn’t.  So, what’s their “Plan B”?

Maybe it’s time that we revisit the original plan.  Plan A?  As long as we continue to purchase industry-standard equipment loaded with full-featured operating systems designed for every conceivable option, we’re going to be locked into this perpetual loop of doing things.  Is it realistic to use a computer and network so bloated and locked down that even the process of booting requires alternate entertainment?  I’m really intrigued by the instant boot and full access to technology that the iPad and now the Google CR-48 provide.  Somewhere along the line, we’ve had to modify expectations about what computer technology can do for the classroom.  I can’t help but think that we need to be rethinking and regearing to reflect a more usable and reliable experience.  I’m really enjoying reading of the successes that people are having with iPod and iPad pilot programs.  The CR-48 is too new to have reports, but it may well be a viable solution that gets students up and running and on task.

Imagine an educational technology world where “Plan B” isn’t necessary!

Powered by Qumana