People don’t fear change that enhances their lives

I have spent a lot of the last 24 hours reading discussions on the subject of Ubuntu, and Unity in particular. I had (and have again) Linux Mint installed, but following issues linked to the screen lock while processes were running in a Python window, I temporarily switched over to Ubuntu.

In the time that it was installed, I discovered user interface design decisions which appeared to have been made with no consideration of users, and it crashed a couple of times. It’s gone and I have gone back to Mint, reconfigured screen savers (i.e., switched them and screen sleep completely off), and the issues I had previously have not (yet) remanifested themselves.

But Ubuntu… Someone at Canonical thought it was a good idea to a) remove the application menu from the application window, b) put it on the global menu at the top of the screen and c) hide it.

The first time this charade manifested itself was with Sublime Text – my text editor of choice for most serious work – and I could not find the menu. It’s one thing to take the menu away from the application window – unwise in my view, but not unknown and probably tolerable. Hiding it was not.

I know that Canonical have done something about this with 14.04, which was released very recently. But this fiasco has been reality for a few years now, and a lot of people screamed blue murder about it. It may be a small and cosmetic thing, but it interferes with usability. It may seem overdramatic, but it is the one single feature of Ubuntu that made me decide the desktop environment was unusable for me. Its key outcome was to make software I wanted to use, and was reasonably familiar with, much, much harder to use. The fact that it took nearly three years for some sort of a fix isn’t really that edifying, to be honest, and few people are going to install the very newest version of a piece of software when a) they know it’s only a week or two into release and b) they need some form of stability.

I’m aware that Ubuntu’s response to criticisms of Unity has been to recommend other distros. When I come to Ubuntu as a new user, that doesn’t really make me feel that Ubuntu is particularly interested in dialogue with its users. No matter how free your stuff is, no one is going to want to use it if they think they are being stomped on.

The other thing someone decided was that no one really needed any sort of reasonable hierarchical application menu. Up front, if you wanted to get at your applications, you had to search for them through either the general lens or the application lens. There are some benefits to being able to search like this. However, there are wholesale user disadvantages to not also having a reasonable hierarchical, categorisable view of your software. For all the world’s complaints about it, even Windows 8’s Metro UI allows you the option of arranging your applications in a logical set of groups. Linux Mint gives you a menu.

Ubuntu gives you a search field. That’s fine for documents and for email in my Gmail account. It is utterly frustrating for managing applications and, more specifically, launchers for your applications.

There is only so much real estate in the immovable launcher on the left-hand side, and anyway, the first thing you have to do on installing Ubuntu is get rid of the – I was going to say junk – but shall we say “stuff you don’t need” before you can do anything. The default size of the launcher is too big (though at least that can be customised), and it comes with a lot of LibreOffice stuff and a direct link to Amazon.

I remember when Windows machines used to come preloaded with all sorts of commercial launchers on the desktop. I didn’t like it then and I don’t like it now. And yes, I know Ubuntu is free.

And this is its big problem. It’s possible that if it wasn’t free and easily replaceable with other free things, I’d spend two or three days getting rid of Unity and installing a more functional desktop – but of course I would have to go and test a bunch of them first, hoping there are no stability issues. The great beauty of Linux is that you can do a lot of customisation (although some of that is seriously limited within Unity). The great disadvantage for Linux is that sometimes people don’t have enough time to do this. They have tasks they want to achieve, and they know that in theory these are easier to achieve in Linux than on Windows (viz. some Python-related work and running a few other open source applications like R). Ultimately, there is a lot to be said for ensuring that when people open a basic, high-profile distro, it works.

Most of what I’ve seen written about Unity by users – viz. people who comment on blogs, as opposed to people who write blogs – is that they’ve gotten used to it. It seems to be more a resigned tolerance than anything. A lot of people have switched over to Linux Mint. A lot have switched back to Debian. A lot have looked for ways of making other desktop environments usable. And a lot complain that it’s only a vocal minority whinging, people who don’t like change. Most people, in my experience, don’t mind change which enhances their lives. When it is utterly disruptive and makes their lives harder, that’s an entirely different kettle of fish.

I’m not a long-term Linux user. It’s unlikely that I will ever again go near Ubuntu. Unity was unusable, and when I looked into it in any detail, it was obvious that Canonical didn’t want to take on board any negative feedback – and it took three years for them to fix, sort of, one of the more annoying interface issues. I know some people find the whole keyboard-centric search approach fine. But I don’t see it as an OS for anyone who isn’t a superuser keyboarder. I see it as an OS to be avoided by people who are interested in structuring the information and assets they have on their computer. It’s all fine having search find everything for you, except the few things you squash onto the Launcher. Everything I tried to do with it up front was a struggle. It’s possible that tinkering around with Linux is a hobby and a game for some people. Other people actually need it to function.

In my view, if you want to try Linux, Ubuntu really isn’t the best choice. Stick with Mint for now.

If I had your data: Job Bridge

Yesterday I read a piece that suggested that the Department of Social and Family Affairs weren’t about to release the names of companies which used the Job Bridge service. So I decided to have a closer look at it with a view to doing some data analysis on the subject.

There are a couple of things I’d like to look at in terms of the current vacancies (around 2,500 per the Job Bridge site), but I haven’t yet figured out how I am going to pull down all that data (webscraping isn’t something I do very often). Having looked at a couple of pages of internships, I am struck by a few things.

  1. a lot of hairdressing and beauty spa positions
  2. a lot of “administration”
  3. a lot of comment about “formal/informal” training
  4. a lot of boilerplate text.

And that’s just the start of it.
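For what it’s worth, pulling vacancy titles off a listings page doesn’t need heavy machinery. The sketch below uses only Python’s standard library; the markup (a `vacancy-title` class on each listing) is entirely my assumption – the real Job Bridge pages would need inspecting first.

```python
# A minimal, hypothetical scraping sketch: the class name
# "vacancy-title" and the sample markup are assumptions, not
# the real Job Bridge page structure.
from html.parser import HTMLParser


class VacancyParser(HTMLParser):
    """Collect the text of elements whose class is 'vacancy-title'."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if dict(attrs).get("class") == "vacancy-title":
            self.in_title = True

    def handle_endtag(self, tag):
        self.in_title = False

    def handle_data(self, data):
        if self.in_title and data.strip():
            self.titles.append(data.strip())


# Sample page fragment standing in for a real listings page.
sample = """
<ul>
  <li><span class="vacancy-title">Beauty Therapist Intern</span></li>
  <li><span class="vacancy-title">Administration Intern</span></li>
</ul>
"""

parser = VacancyParser()
parser.feed(sample)
```

Against a live site you would feed the parser each downloaded page in turn; the same approach extends to pulling out location and sector fields once you know the actual markup.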

Job Bridge itself provides “Job Bridge Data”. This is not data. This is a summary of aggregated data. I have seen a claim that 60% of Job Bridge participants go on to obtain work after the program, but there is no information on that in the “Job Bridge Data”, for example. What would be interesting out of that data is where those participants are getting jobs. Are they getting jobs in their host companies, and is it pretty much the case that the state is effectively paying for a six- to nine-month probationary period? Having done internships myself in a previous life, my experience of full-time internships is that if companies needed them, they paid for them. I know that a few of the technical companies here still do, for example.

I have a couple of data projects on the go at the moment, plus some work for my own college course, and unfortunately I don’t see a quick and obvious way of getting the current Job Bridge vacancies down to me, even in an unstructured manner. This is regrettable. I know there is a lot of ongoing controversy about the Job Bridge program and, to some extent, understandably so – I saw at least one teaching position (“must commit for the nine months”) and one farm labourer position. These, in my book, are jobs rather than internships. I’m not really that interested in picking out the odd job here and there to make that point, however. I’m interested in seeing which sectors are using the program, and whether there is any way of telling how many of these internships should really be classified as jobs. Some sort of structured data that I can pull into R would be nice. I’d also like to do some spatial analysis on where these are, and again, the structure of the site does not lend itself to that because of oddities like Tipp North and Tipp South being separate, all the Dublin vacancies being dumped into one bucket, but the city and county listings being separate for Cork, Galway, Limerick and Waterford. What would be nice is a structured dump of the data.
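Even without a structured dump, the inconsistent geographic buckets could be normalised before any spatial analysis. A minimal sketch, assuming the label variants are roughly as the site presents them (the exact strings would need checking against the real listings):

```python
# Hypothetical normalisation of Job Bridge location labels into
# single county buckets. The raw labels below are assumptions
# based on the oddities noted above, not a verified list.
from collections import Counter

REGION_MAP = {
    "Tipp North": "Tipperary",
    "Tipp South": "Tipperary",
    "Cork City": "Cork",
    "Cork County": "Cork",
    "Galway City": "Galway",
    "Galway County": "Galway",
    "Limerick City": "Limerick",
    "Limerick County": "Limerick",
    "Waterford City": "Waterford",
    "Waterford County": "Waterford",
}


def normalise_region(label):
    """Map a raw site label to one county name; pass unknowns through."""
    label = label.strip()
    return REGION_MAP.get(label, label)


def vacancies_by_county(raw_labels):
    """Count vacancies per normalised county."""
    return Counter(normalise_region(label) for label in raw_labels)
```

With the labels folded into consistent counties, the counts could then be exported for mapping in R or anything else.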

One can wish, I suppose.

Extraordinary claims require extraordinary evidence

As of sometime yesterday evening, my Twitter feed has lit up with claims that a computer has passed the Turing test for the first time. These claims have their roots in a press release from Reading University.

The details of the test and how it was carried out are thin on the ground. We do know from Reading’s press release that one of the judges was an actor and that in 33% of cases, the computer could not be distinguished from a human.

I have a couple of key questions.

  1. What language did the humans interact with the bot in? This is important because the bot is presented as a 13-year-old boy from Ukraine. If the interaction was in English, then for me, all bets are off.
  2. Where is the peer reviewed paper?

The Turing Test is, in many respects, iconic. If someone claims to pass it, a press release is going to be nowhere near adequate to support that claim. We need to know a lot more about the system concerned, how it works and how it operates.

University isn’t a finishing school, you know

There has been some discussion lately about Jackie Lavin’s contribution to Prime Time on Tuesday 3 June, when the subject under discussion was third-level education. Amongst her assertions were that some graduates didn’t have a clue and that you could cut at least a year off most university degrees.

I have trouble with those two assertions for various reasons. But I have a much bigger problem with all this: other than seeing her complaints about how her banks have treated her, I really have no idea why Jackie Lavin has the media profile she has in business terms. I’m aware there was a hotel in Kerry that may not have been completely successful, as it were, but otherwise, apart from being Bill Cullen’s partner, I’m not actually sure what her achievements are – and I certainly cannot see what qualifies her to come on national television and claim that some of the graduates she worked with on The Apprentice didn’t have a clue.

Of course they don’t. Common sense is something you get from experience, from trying and failing. Lavin’s comments on this subject are not common sense, in that respect. They are detached from reality. Not only that, the cohort of graduates which might apply to go on The Apprentice cannot be held to be representative of graduates as a whole. It is a self-selecting cohort, and if I had a business idea, I cannot say that I’d necessarily consider The Apprentice the best place to be pitching it. In fact, bearing in mind that television is in the business of entertainment and not in the business of business support (except if you are paying for advertising), arguably there’s a good reason for staying away from the whole exercise altogether. It probably is even, dare I say it, common sense to focus on the needs of your business, which may not, almost certainly won’t, align with the needs of an entertainment medium.

With respect to the idea of chopping the length of undergraduate degrees, again, I’m not sure that you can a) generalise or b) comment if you’re not sure what the purpose of an undergraduate degree is. Brian Lucey has provided some commentary here, and it is more detailed than I can provide at this point in time. I recommend reading it. That being said, the breadth of options on the undergraduate degree supply side is such that it would be highly unwise to suggest a generalised change to all of them.

There are other issues, of course – Paddy Cosgrave has already made rather unpopular comments about the standards of degrees between different awarding institutions, and then there is the ongoing suggestion that universities should act as supply belts to employers. In truth, it’s not that simple and never has been. It seems to me that what employers value has changed over time; 20-25 years ago there was less of a specialised focus on particular skillsets, such as programming in a specific language, and more of a focus on understanding how things worked in general.

There are massive, massive cost issues on the academic and government side in tailoring university courses exactly to what any particular employer wants. A key example of this has floated straight to the top in programming languages: Apple have announced a new one, Swift, which, over time, is near guaranteed to replace Objective-C. So the question is, do you want someone who is an Objective-C expert, or someone who can take the handbook for Swift and hit the ground running? The truth is, you’re more likely to get the latter if you haven’t rammed all your efforts into training Objective-C programmers just because that’s what the iOS application market was demanding.

Good employers understand this. Good employers understand that there are generalised skill sets they need, and specialised skill sets they may have to arrange for new hires to get themselves. This is true of most employers and this is why continuing professional development matters. It is a recognition that learning is an ongoing activity and you don’t pop out of university aged 22, finished. University is less a finishing school for employment and more a starting school.

There is a wider debate to be had on identifying what we need to focus our educational efforts on but, as anyone who has ever actually discussed this with me in the real world will know, this is not something that we can discuss in terms of any part of the education system in isolation. When we have issues with language learning and mathematics at leaving certificate level, the problems did not start when those students reached the age of 16, but probably when they were 7 or 8.

So I am in favour of a general reframing of our discussion of education in a coherent manner covering the whole, rather than some little details. When you have someone who is famous for – I’m not exactly sure what – in the business world popping up suggesting that we should shorten undergraduate degrees, I don’t think we get that discussion.

When RTE run these discussions, I would like them to consider their speakers a little more carefully. I am not sure what sector Jackie Lavin was supposed to be representing, but if it was the employers’ side, then I don’t think she was the best choice. I would like it if, for example, RTE took a serious look at our middle industry and brought in senior people from the likes of Kerry Foods, any of our home-grown agri-industrials, or any of our home-grown pharmaceuticals. They tend to need a broader skillset; in certain cases they need to get people to come to less popular locations, with the indirect costs that can bring; but above all, they are dealing with different challenges, different realities.

And they are the sector we need to hear much, much more from. I would love to see what, for example, Colm Lyon of Realex, or Eoghan McCabe of Intercom, or Edmond Harty of Dairymaster, or someone from Perrigo might have to say about the constraints they work under, given the current focuses in the third-level education sector. I think it might be a lot more nuanced.