Why I Left Evernote

I’ve used Evernote since about 2005. I collected over 6,000 notes in that time. I’ve since learned those are rookie numbers. Some people have tens of thousands of notes. But these are mine. They’re websites I’ve clipped for future reference. Journal entries. Planning. Ideas.

I left Evernote at the end of 2016 when they said they’d allow employees access to notes in personal accounts. I have some very personal notes in there, and that spooked me. I switched to OneNote, but came back to using Evernote in 2022.

Since then, Bending Spoons has bought Evernote. That didn’t bother me. Then they laid off all the original Evernote staff. Then it seemed like they were neglecting the product, which had gotten very slow. A lot of users fled for other products like Notion or Obsidian during this time.

Bending Spoons eventually started providing updates, and sent out a communication that they had to overhaul Evernote’s foundations to make it future-proof. It got much more responsive. But they also jacked up the price to $129.99 a year for a Personal account, the plan I’ve been on for many years. It used to be around $30, then got raised to around $60.

I was due to renew in January. At first I thought about keeping it because I’ve been with the product for so long and it does its job well. But then Evernote started crashing on my iPad when I tried to edit any note. I reached out to support, but they stopped responding after I sent my activity logs. For my workflow, I’ve got to have a product that works on my iPad.

I attended a Meetup for a professional topic and tried to take notes in the browser. When I got up in the morning, I had a note with a title and no content. At least I took screenshots of the slides so I could stick them in the note along with what I could remember. But this was an intolerable failure of the product.

But where to go? There are many options available. The Keep Productive YouTube channel maintains a Toolfinder website at https://toolfinder.co/. Take a look and see how overwhelming this can be.

I tried to get Notion to import from Evernote early last year. It only brought over about 400 of the notes in my main notebook. I ran the import again, and this time (twice, actually) it worked with a complete export of my Evernote data, so I use Notion now. The Web Clipper doesn’t work (at least on Safari), but so far I’ve been able to copy and paste anything I want to keep into Notion.

I started exploring Tiago Forte’s PARA methodology. Notion seems to work well with it. I think Tiago also completely left Evernote for Notion.

If you’re curious about Notion, there are more videos on YouTube about it than you could hope to get through in this lifetime. It’s a very powerful product. Maybe it’s overkill for my needs, but I can grow into it.

Giving Linux Another Try

I’ve used Linux off and on since 1998 or 1999. I originally bought Red Hat off the shelf and installed it. At the time, my video card wasn’t supported, so I couldn’t get a Graphical User Interface (GUI) to work. I played with the command line for a while, then reinstalled Windows 98.

A few years later, I bought SuSE off the shelf. I had enough old parts to build a separate computer, so I installed it on that box. I had a Windows computer and a Linux computer. I don’t remember doing anything extraordinary with it. I browsed with it occasionally.

I’ve tried a few other variants. When Debian released an .exe installer you could run from Windows, I tried it in a virtual machine and wrote a blog post about the results. Ubuntu is one of many derivatives of Debian Linux.

But typically, I found Linux too limited for my use. I guess I’m spoiled by Windows and Mac applications, and if there is an alternative on Linux, it rarely measures up and is not interoperable with Windows and Mac users. Web applications have alleviated that to a point, but sometimes it’s better to just have an app that runs on your own device.

People on Linux forums are generally friendly and want to be helpful, but they are often not actually helpful. Even today, visit a Linux forum and ask “Which version should I use?” You’re likely to get the response “Whatever works for you.” Now, I’m fairly technical and have some Linux experience, and that barely means anything to me. Imagine some 70-year-old Boomer who can barely turn the computer on and doesn’t understand the difference between the operating system and Microsoft Word. How helpful is that going to be? Most people do not want to spend weeks installing Linux distro after distro until they finally land on that magic one that “works for you.” I wish Linux people understood this.

You also hear things like “You should compile your own kernel!” Why would I want to do that? Why would anybody trying to leave Microsoft and Apple want to just jump into that end of the Linux pool? Once you get REALLY good at Linux, sure, compiling your own kernel might be fun, but for the casual user who just wants to use their computer, it’s not fun. It’s very complex, and you need a deep understanding of kernels to do it. I tried it once, and I was lost. I didn’t understand what most of the components were and eventually ran out of patience for Googling them and why I should include or exclude them. If you want to promote Linux adoption, stop telling n00bs they should look at compiling their own kernels.

You can occasionally find posts and videos that will tell you about various distros, but they’re generally of the variety “Here’s this one. You might like this feature. This other one has this other feature. You might like that too.” It’s like they’re trying to be TOO impartial.

You may hear testimonials like “I breathed life back into my old Dell laptop by installing Linux! It runs fast!” So you take an old laptop, install Linux, and cannot duplicate the experience at all. It may run slower than Windows, if it runs at all.

I took a Raspberry Pi 3 a friend gave me a while back. I tried running Ubuntu on it, but it was too slow; Ubuntu wants 4 GB of RAM, and the Pi 3 has 1 GB. The Pi 4 is out now with up to 8 GB of RAM. I got the Pi 3 running with Raspberry Pi OS (formerly Raspbian) so I can play with it.

I figure I’ll dig back into Linux and try to find a way to think about it that makes sense to me. Some of it is probably the philosophy Linux is built around. Linux began as a Unix-like operating system written for x86 PCs, and Unix was developed “by programmers, for programmers.” It makes sense to programmers, but not so much to the rest of us. I am not a programmer. I’m technically capable of learning it, but my experience in a Java class for my Bachelor’s degree made me swear off programming. I know enough to work with programmers and to manage software projects, but I haven’t been interested in writing programs since that class.

I’m glad Linux installers handle hard drive partitioning now. I remember having to manually partition my hard drive before I could even THINK about installing Linux. And Linux required at least 2 partitions, sometimes more.

New iPad Pro/iOS 11

The newest Apple announcement (I think it was last week) had some yawns and a few things to get excited about. Let’s talk about the 10.5″ iPad Pro. According to Mac Rumors:

I’ve seen the 12″ iPad Pro. It almost seems too big. Haven’t bothered checking out the 9.7″. I have an iPad Mini 3, which I don’t use all that much.

I’ve had the iPad 1, 2, and 3, and the Mini 2 and 3. (The Mini 2 was from a failed mobility project at work. They deactivated the cell data and told me they didn’t want the tablet back.)

The iPad has a buttload of promise, but one huge shortfall: it’s still a MOBILE operating system. That means you can only do one thing at a time. My personal “workflow” normally involves having a YouTube video playing in the background, or a podcast, while I’m going through email, reading RSS, or whatever else. You CANNOT do that on an iPad. OK, I have two Minis, so I guess I could play the YouTube video on one while I work on the other. Yeah.

Apple seriously needs to put some desktop features on the iPad, specifically the ability to open a YouTube video in Safari, then switch back to Apple Mail. Normally, when Apple goes through its annual update to iOS and MacOS, I see things like the following:

Update to iMessages!

Update to Photos!

14 freaking cameras that work as one!

Undecipherable update to Apple Music!

None of which excite me. I use an Android phone, so iMessages is useless to me, and I have yet to get excited about a new iPhone due to the above. Photos has some usefulness, but for my purposes, the built-in Windows photo viewer does what I need. And on my Mac, I just go through my pictures in Finder. Don’t care about cameras; ever since the iPhone 4 era, phone cameras have been at the resolution I need. And I don’t use Apple Music, which only works on Apple products. I’ve been using Amazon Prime Music lately (not the extra subscription service.)

According to Mac Rumors, iOS 11 does seem to have some interesting features (at least on the iPad Pro):

Let’s see what Apple has to say.

Files – could be a game changer. Imagine that; actually being able to manipulate the file system on a tablet (you could do this on Windows tablets back in the 90’s… welcome to the future, Apple!)

The Dock – could be interesting. Is this only on the Pro, or will I also get this when I upgrade my Mini 3? (Yes, at the bottom of the page, both the Mini 2 and Mini 3 will get iOS 11. But will they get ALL THE FEATURES?)

Multitasking – if this is available on my Minis, that would be great. I could use them more.

Drag and Drop – again, you had this capability on Windows tablets back in the 90’s. You’re way late to the party, Apple!

Apple Pencil – will this work on the Mini?

Instant Markup – Adobe has had this on phones and tablets for years, but having it native would be useful.

Instant notes – again, had this in Evernote and OneNote. My cheap ass RCA Windows 10 tablet from Walmart came with this feature. But having it on the iPad means we’ll think Apple invented it!

Inline Drawing – looks pretty cool!

Scan and sign – part revolutionary, part late to the party. Evernote did this years ago. I had to scan my entire 110 page divorce agreement in 2015. I used Evernote for this. But to be able to sign… will anybody accept it? I tried to sign documents in Adobe with a stylus and had them rejected. I had to revert to 1848 technology (yes, the fax was invented THAT FREAKING LONG AGO) and “wet sign” and fax said documents, even though my signature looked exactly the same. I couldn’t tell the difference. How could they?

Quicktype Keyboard – interesting. I’ll have to play with it.

Augmented reality – just an API. Others have to develop apps for it. I had an AR app on the Nokia 920 I had in 2014. It was pretty cool. And Apple has had the ability for people to develop hardware that plugs into iOS devices since the iPad 2. I don’t know if anybody ever did.

That’s enough. I don’t get paid for this.

Chances are, I’ll get the public beta when it’s available. I’ll make some updates after I’ve gotten my hands on it. Now I need to look up macOS 10.13, High Sierra. Although it’s been years since Apple introduced anything revolutionary into macOS.

WannaCry… Who Should Get The Blame?

I’ve heard Friday’s cyber attack called both “WannaCry” and “WannaCrypt”. I’ll stick with WannaCry for now.

As we know, on Friday, tens of thousands of users in about 150 countries were hit with a cyber attack that encrypted their files and locked them out of their computers unless they paid $300 worth of Bitcoin. After a few days, it goes up to $600. (I assume, for larger organizations, that’s $300 PER COMPUTER.)
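
To put a number on that assumption, here’s a quick back-of-the-envelope sketch in Python. The $300/$600 figures come from the reporting above; the fleet size and the helper function are my own invention for illustration.

```python
# Hypothetical back-of-the-envelope: WannaCry ransom cost for an organization,
# assuming the $300-per-computer reading is right and the price doubles to
# $600 after a few days. The fleet sizes are invented for illustration.
RANSOM_INITIAL = 300   # USD worth of Bitcoin, per machine, if paid promptly
RANSOM_LATE = 600      # per machine, after the deadline passes

def total_ransom(infected_machines: int, paid_late: int = 0) -> int:
    """Total payout if `paid_late` machines miss the early deadline."""
    paid_early = infected_machines - paid_late
    return paid_early * RANSOM_INITIAL + paid_late * RANSOM_LATE

# A 200-machine office that pays promptly:
print(total_ransom(200))                 # 60000
# Same office, but 50 machines slip past the deadline:
print(total_ransom(200, paid_late=50))   # 75000
```

Even the worst case there is pocket change next to an incident-response engagement plus the lost business, which is the point.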

Obviously, governments and cybersecurity “experts” are telling those affected not to pay, and to trust those governments and experts.

My professional opinion? Pay the ransom, learn your lesson, and NEVER let it happen again. I guarantee you, $300 worth of BitCoin will be a LOT cheaper (assuming you take corrective measures) than bringing in experts to recover your systems, and of course the lost business and efficiency.

Biggest Factor

Pardon my French, but this attack was apparently VERY FUCKING EASY to prevent. The #1 factor involved was: outdated versions of Microsoft Windows, mostly Windows XP.

Background

Let’s review: Windows XP was released in 2001. I remember being very excited to get it. About that time, cybersecurity started becoming a big issue, and Microsoft had to devote a TON of resources into beefing up XP’s security rather than develop a new version of Windows. Windows XP Service Pack 2 was released in 2004. This incorporated a firewall and some new security features. Now, Microsoft was able to work on a new version of Windows, and shit out Windows Vista in 2006.

By most accounts, Vista was a flaming piece of crap. I liked it over XP, as it had some neat new productivity features, but it was a gigantic resource hog. It apparently needed 4 GB of RAM to run decently, at a time when most consumer computers came with 256-512MB. Also, Vista had some substantial changes to the system and security architecture that are still causing problems for those too stupid and cheap to upgrade from Windows XP.

The head dude in charge of Vista was fired or put aside, and Microsoft came out with Windows 7 in 2009. Windows 7 is what Vista SHOULD have been in the first place. It actually ran very well without needing a top end computer. Windows 8 followed around 2012, then Windows 10 in 2015.

I liked Windows 8, but I think I was the only person on Planet Earth who did. Most people couldn’t stand it. I’m smart enough to spend the 5 seconds I needed to on Google to figure out how to operate it, and I never had a problem. But Microsoft had to build back in the legacy features from Windows 7 because everybody else but me can’t handle change or 5 seconds on a search engine. (This includes Bill Gates, who allegedly came back to Microsoft part-time, was given a Windows 8 computer, and demanded to go back to Windows 7 because this software genius can’t handle a search engine either).

While all of these versions of Windows were coming and going, people got so dependent on XP that Microsoft was forced to keep supporting it. They originally intended to end support in 2008, but ended up extending it a couple of times until 2014. Many companies had custom applications that were practically hard-coded to only work on XP. I knew a dentist who, in 2010, went to buy some new Windows 7 laptops. He tried to run his dental application, and it wouldn’t work. He called support, and sure enough, they didn’t support Windows 7.

I see a doctor who uses Windows Server 2003 for his application. Every freaking time I go in there, I bring this up. Server 2003, like XP, is long since out of support. I ask the doctor, “Are you trying to get hacked? Are you trying to get my data, and that of all the rest of your patients, stolen?” He keeps telling me he’ll bring it up with the people who provide their IT services, but so far, nothing has happened.

I go to another doctor who does everything on paper. As much as the IT professional in me cringes at the stacks and stacks of paper and records in his office, I realize there’s no F’ing way he’ll get hacked. Of course, an office fire, a break-in, or a misplaced record will have the same effect, so you’re screwed either way. You might as well embrace IT and TAKE REASONABLE MEASURES, and yes expenses, to protect it.

Second Biggest Factor

When it comes to any type of security, your biggest threat comes from inside. It’s your users, your employees, even your family. And it’s not even that they want to be a threat. They’re just people (or sheople) stumbling through life without paying much attention.

For a class I took last year, I had to complete a cybersecurity simulation. The setup: you’re running an IT organization for 4 quarters, and you have a budget. You can only spend so much each quarter to protect your network. You can spend it on appliances (firewall, IDS, IPS, etc.), user training, antivirus for computers, and so on. But it’s a limited budget. And I had to get at least a 95% before I could submit my certificate. I was at it for hours.

I remember one round in particular, several rounds in. I’d somewhat gotten a feel for what areas I had to cover with the limited budget. Like, you can’t just give 5 rounds of user training and forget to install a firewall. So I had two good quarters, and defeated all the cyber attacks. Then, at the end of the 3rd quarter, the simulation hit me with 3 social engineering attacks in a row. All were successful, and I had to play again. I finally got a 96% on one round, saved the certificate to PDF, and emailed it to my instructor. I was not going to try to top that score.
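
I can’t reproduce the actual simulation (it was a commercial training product), but the core trade-off it teaches can be sketched in a few lines of toy Python. Every defense name, cost, and attack category below is invented for illustration; the point is just that an unbalanced budget leaves whole attack categories uncovered.

```python
# Toy sketch of the budget-allocation idea from the simulation. All names,
# costs, and attack categories are invented; the real product was far richer.
DEFENSES = {"firewall": 30, "user_training": 20, "antivirus": 15, "ids": 25}
COVERS = {  # which defenses blunt which attack category
    "network_scan": {"firewall", "ids"},
    "malware_email": {"antivirus", "user_training"},
    "social_engineering": {"user_training"},
}

def run_quarter(budget: int, purchases: list[str], attacks: list[str]) -> int:
    """Return how many attacks got through, given what you could afford."""
    owned = set()
    for item in purchases:
        cost = DEFENSES[item]
        if cost <= budget:   # skip anything you can no longer pay for
            budget -= cost
            owned.add(item)
    # An attack succeeds if none of the defenses that cover it were bought.
    return sum(1 for attack in attacks if not (COVERS[attack] & owned))

# Five rounds of user training still won't stop a network scan:
print(run_quarter(100, ["user_training"] * 5, ["network_scan"]))  # 1
# And all appliances with no training loses to social engineering:
print(run_quarter(100, ["firewall", "antivirus", "ids"],
                  ["social_engineering"] * 3))                    # 3
```

That second call is essentially the three-social-engineering-attacks round that sent me back to the start.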

A user can be actively subversive: a double agent, a spy, or some other flavor of insider working against you. But you probably have far more to fear from casual carelessness or simple ignorance of security.

We’ve all had a casual friend whose email account started throwing off spam. Suddenly you get poorly worded English from them telling you to click a link. I always catch these; most people don’t. I tell the person to change their password. They probably change it from password to password1 and resume spewing spam as soon as the spammer cracks the new one. I just mark them as spam, since I don’t normally correspond with those people by email. But most people get an email with something like “You have GOT to see this!” and click the link, which brings malware onto the network. And if you’re still stupid enough to be running Windows XP, now you’re infected.
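
The password-to-password1 move is worth spelling out, because cracking tools don’t guess randomly; they apply mutation rules to a base wordlist, and trivial tweaks like appending a digit are among the first rules tried. Here’s a minimal, hypothetical sketch of the idea (real tools like hashcat ship rule files with thousands of these):

```python
# Minimal sketch of rule-based password guessing: trivial mutations of a
# known or leaked base word are generated before anything exotic is tried.
# This tiny rule list is invented for illustration.
def mutate(word: str) -> list[str]:
    """Generate the obvious variants a cracking tool tries first."""
    guesses = [word]
    guesses += [word + str(d) for d in range(10)]             # password1 ...
    guesses.append(word.capitalize())                         # Password
    guesses.append(word + "!")                                # password!
    guesses.append(word.replace("a", "@").replace("o", "0"))  # p@ssw0rd
    return guesses

# So changing "password" to "password1" buys essentially nothing:
print("password1" in mutate("password"))  # True
print("p@ssw0rd" in mutate("password"))   # True
```

A genuinely new, unrelated passphrase defeats this whole class of rules; a cosmetic tweak does not.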

Or, consider this scenario, which I used to explain social engineering to my wife:

<phone rings> “Random hospital, Karen speaking.”

Social Engineer: “Hi, Karen, I’m looking for John Smith.”

Karen: “There’s nobody here by that name.”

Social Engineer: “I’m sorry about that. Must be a wrong number, but I talked to John Smith from Random Hospital. Maybe you can help me, Karen.”

Karen: “Sure, what can I do for you?”

(Most people want to be helpful.)

Social Engineer: “I’m working on a proposal to upgrade Random Hospital’s computers. I’m wondering if you can tell me what operating system you’re running. I want to give you better equipment if I can.”

Karen: “I don’t know much about computers.”

Social Engineer: “That’s OK, it’ll just take a second. I can walk you through it. I really appreciate your help, Karen. Click on Start…”

<walk through of finding OS version, maybe browser version and what antivirus>

Social Engineer: “OK, so Random Hospital runs Windows XP, Internet Explorer 7, and McAfee. Thank you, Karen. I appreciate your help.”

Now, Social Engineer knows Random Hospital is stupid enough to be running an out of date, unsupported operating system with well-documented vulnerabilities. Dis gon be gud!

That’s all it takes. Or digging through a dumpster. Social engineers can get a TON of good information from all the crap users throw away.

And that’s just from an employee who was trying to be helpful, not subversive.

For the record, I don’t tell people shit over the phone. I don’t look up numbers or give names to them unless I know who they are. And it’s not that I don’t want to be helpful or friendly, but because I know how social engineering works. And I’m not going to be the idiot who compromises my organization.

I can do this all night, but I think you get my point. WannaCry could all have been prevented IF the affected organizations had been running currently supported operating systems with recent patches and updates applied, which also helps mitigate user carelessness.

There’s one final factor we need to look at:

The Deep State, Unaccountable Spying Agencies

Here in America, we have the National Spying Agency and Cocaine Importing Agency. From what I’ve read, the very exploit that caused all of this, EternalBlue, was built by the NSA and leaked by the Shadow Brokers. The recent Vault 7 leaks, meanwhile, included malware developed by the CIA that was apparently left somewhere easy to access from outside.

Heads should roll for this. Both agencies most likely need to be gutted, involving people being fired and/or prosecuted. Proper lines of control need to be drawn and enforced.

Conclusion

I saw some claim today that this is all Microsoft’s fault. They should have left Windows XP in support forever.

Right. Should your car manufacturer be forced to support whatever car you drive indefinitely? Sooner or later, things break down and you need to buy a new one. Even if we all agreed to just freeze technology where it’s at forever, to never again develop newer hardware or software, maintenance still needs to be done. As people develop new exploits, those have to be patched. Sooner or later, the best way to defend against those vulnerabilities is going to be radical changes to the architecture of the operating system and software that runs on it. So no, this is a bad idea. Plus, how are the tech companies supposed to make money if they can’t convince you that you HAVE to buy a new phone every year?

I got into IT because I’m excited by new technology, new features, and new capabilities. I’ve spent most of my career frustrated by the baby boomers and people afraid of change forcing us to keep doing things the old way. I’m convinced that email is pretty much obsolete, but the biggest tool I still use at work is Microsoft Outlook. And I finally got Office 2013 on my work computer. I’ve been running Office 2016 at home since last year.

(I wish I could get into independent consulting, but I don’t appear to be entrepreneurial).

A lot of people in IT need to get out of it. Go find something else to do, and stop holding the rest of us back. Or actually, start learning about it so you can do it effectively. Read some books or magazines, or watch some YouTube videos that don’t cost you anything. Learn and grow.

IT is the ultimate cargo cult. Everybody thinks it should be easy and fun. How often do you hear somebody who can barely charge their phone say “I’d like to get into IT!” My wife was saying that when I met her. I was able to make her head spin enough with my own knowledge of IT to convince her it’s not a good idea. No, of course, I’m willing to help her, but once she realized what was involved in learning it, and how getting in with no experience is a pay cut and a shitty help desk job, she changed her mind.

I guarantee a lot of IT departments need to start firing people over this WannaCry episode. If you run a business and don’t know much about IT, make sure you bring in someone who does. If you contract with a 3rd party to provide your IT support, make sure they have a plan for obsolescence. What are they going to do when Microsoft releases a new version of Windows and discontinues support for the current one? If their answer is “Oh, it’ll be fine…”, DO NOT HIRE THEM!

Android Central: The 5 Worst Things About The Galaxy S8

I have a love/hate relationship with Android. I like to skip around between the various phone platforms so I can maintain knowledge of them. But, Windows Phone is effectively dead, since nobody makes or maintains apps for it and Microsoft just can’t commit to a consumer strategy, and Apple isn’t impressing me lately. All they seem to care about anymore are iMessages and cameras. And jacking up the price point.

My last phone was the Note 5. At one point, I swore it would be my last Android phone. I found it to be buggy and unreliable, but it never blew up, so it had that going for it. The battery life was terrible. My first unit had to be replaced under warranty because the battery would be dead in 3 hours even if I shut everything down and didn’t touch the phone. The replacement didn’t do much better.

Two factors have me locked into Android, and specifically Samsung. Those are:

  1. LastPass integration
  2. Samsung Pay

We all have so many damned logins and passwords it is impossible to keep track of them all. Everything requires an account. Everything. I have LastPass Premium, which costs $12 a year. On every other device, every time I have to log into an application or website, I have to bring up LastPass, check my username, copy my password, switch back to the app or website, and paste the password in. On Android (when it works), I just authenticate to LastPass with my fingerprint and it fills in the details for me. This simplifies life.

I didn’t care much about mobile payment apps before. Most features the tech companies roll out seem to only matter to people who live in San Francisco or New York, not so much northern Virginia. I used Apple Pay once at Wegmans when they were running a beta test. Then I found out Samsung Pay uses a technology called MST (Magnetic Secure Transmission), which induces a magnetic field in the credit card reader and works with almost all of them (except gas pumps). So I started using my phone to pay for everything, which is a hell of a lot easier than pulling my wallet out, getting my check card out of its sleeve, swiping, and trying to fit it back into the sleeve. And for now, paying with my phone still blows people’s minds, which is kind of cool.

When it came time to decide what to do about the frustrating and unreliable Note 5, I had a few things to consider. Get an iPhone 6s+ or 7? Wait for the S8? Or get the S7 Edge?

About a month prior to the S8’s release, Samsung started selling the S7 Edge for $200 off to clear inventory. The S7 Edge had been out for a year and had a proven track record, and people I know love theirs. So I bought one directly from Samsung.

And when I read things like this, I’m glad I did. I haven’t really heard anything good about the S8. Sure, it’s all pretty and stuff, but like most technology, it’s going backwards. They make a nice design and add a few new features (most of which are pointless), but other than a slightly newer processor, there’s nothing earth-shattering.

The video describes Bixby as a 2012 era Google Now. I’ve never had much use for Google Now. I guess it’s good if you live in San Francisco, take public transportation, care about sports, and can afford to eat at expensive, reservation requiring restaurants.

It supposedly has some feature (powered by Pinterest!) where you can use the camera to get it to show you images similar to what you’re looking at. But it doesn’t look that useful. This is another case of technology going backwards. Back in 2014, I had a Nokia 920. It came with an augmented reality app that was very useful. I don’t remember what it was called, but you could take the phone out, power it up, and scan around. Through the camera, it would give you information about your surroundings, and what was behind them with clickable links that opened in the appropriate app. I haven’t come across anything like it since. I’ve had Google Goggles on my last two phones, and it’s nowhere near as useful as that app on the Nokia 920 was three years ago.

The fingerprint scanner on the back of the phone looks like a serious pain in the ass. I’m fine with scanning my thumb on the front of the phone. It’s convenient, and I don’t have to turn the phone over to do it.

Another backwards feature: say you invite me over. You text me your address. 3 years ago, on the Nokia 920, I could click the address, it would ask me which app to open it in, and I could select Waze. Now, it automatically opens in Google Maps. So I have to manually copy the address, open Waze, and paste it in there so I can navigate over (Google Maps sucks for navigation, and it doesn’t show you where the cops are.)

I’d love to see technology start moving forward again. I’d love to see some truly revolutionary stuff. New ways of rendering the Human/Computer Interface. Augmented reality. But it seems like all we’re getting right now are dual cameras and messaging platforms. And mindless games.

Speaking of which, does anybody remember the multi-protocol messengers of the late 90s and early 2000s? We had all these IM platforms like AIM, ICQ, and Yahoo, and clients like Trillian tied them all together into a single application. We don’t have that now. iMessages is about the closest thing I’m aware of, but it only works on a complete Apple platform. I’d love to be able to text from my computer and pick the conversation up on my phone, whether I’m using SMS, Signal, Telegram, or whatever.

Has Microsoft Learned Nothing From Windows RT?

Remember the launch of the Microsoft Surface in 2012? It came in two versions: the Surface RT and the Surface Pro. The Surface Pro ran a full version of Windows 8.

The Surface RT ran a version of Windows 8 that was designed for an ARM processor, and could only run apps from the Windows Store.

The Surface 2 was the last RT version. The Surface 3 ran a full version of Windows. Why? The RT wasn’t selling well, because of its tie-in to the Windows Store.

There has never been a point in time when the Windows Store didn’t suck. There are a few good apps on there like Wunderlist and Evernote, but for the most part, nobody supports Windows apps. Facebook recently pulled their app (I didn’t know it had lasted that long, but recently saw somebody complaining about it on Facebook).

Now, Microsoft is back with Windows 10 S. It sounds great on the surface. It’s stripped down and agile. Sounds cool, right? What could go wrong?

From Microsoft’s own site:

Microsoft-verified security

Your applications are delivered via the Windows Store ensuring Microsoft-verified security and integrity. Microsoft Edge is your default browser since it’s more secure than Chrome or Firefox.¹ Windows Defender and all ongoing security features of Windows 10 are included.

Yep, it will, once again, only run apps from the Windows Store. The same Windows Store that Microsoft just can’t encourage ANYBODY to develop for, or to keep maintaining apps for once they’ve dipped a toe in the water.

Not that you need a lot of apps. Early in the days of app stores, most of us were constantly downloading and trying out new ones. For the most part, we’ve figured out what works for us, and since all they’re doing anymore are messaging apps, there’s no point in looking for new ones, unless you like mindless games.

I don’t mind Edge. I have a cheap, Windows 10 tablet that I got to play with. I only use Edge on it. Since they came out with extensions for Edge, I have the functionality I need (Lastpass, Pocket, etc). Internet Explorer 11 sucked balls. It still sucks balls. I have to use it at work, although I finally got Firefox installed on my work computer, so I only use IE for work related sites. Edge isn’t bad. I still use Chrome on my Mac because Safari sucks and Brave isn’t close to prime time yet.

I’ve seen a few headlines that if you buy one of those new Surface laptops, you can get a free upgrade to Windows 10 Pro. You probably should. The Windows Store is about useless.

And The “Stupid App Of The Year Award” Goes To…

Egalitarianism is bullshit. I don’t know what women think “equal” means, but this isn’t it. If you want to run with the big dogs, you have to be able to keep up. That means dealing with being interrupted without launching a campaign and getting somebody to build a silly app. Everybody gets interrupted; not just women. Oh yeah, and inventing stupid new words. We can play that game too: Bitchterruption (womanterruption is too long and sounds too stupid). Bitchsplaining.

iPad Only?

I’ve been using Michael Sliwinski’s application Nozbe for a couple of years. It’s not perfect, but what is? Wired had an article a while back asking how it’s 2016 and we still can’t have a decent productivity app. There are tons and tons of productivity apps. Those that are powerful on the desktop either aren’t present or are pathetic on mobile (MyLifeOrganized is an example). Those that are good on mobile don’t work well on a desktop-type system (I consider laptops in that category: a full-fledged Windows or Mac system). Nozbe seems to hit the high points and has a consistent user experience across all platforms.

Michael and a co-author wrote a book called iPad Only. I haven’t bought the book yet. Based on user reviews, I do not perceive it to be worth $10.

I have tried over the years to figure out a true mobile experience, though that’s not easy for me. I took a class a while back where I didn’t have much desk space. I brought my MacBook Pro the first day, but after that I used my iPad Mini with a Bluetooth keyboard for the rest of the week. It got the job done all right. Cloud storage integration helps a lot here, assuming you have a consistent Internet connection.

The iPad has definitely come a long way since the beginning. At first, it was pretty much a media consumption device and still very limited at that. Along the way, better apps and technology were integrated into the platform. Now it has Microsoft Office and other productivity apps.

I think the biggest limitation on the iPad at present is that it is STILL run by a mobile operating system. While Microsoft has Windows 10, which even on their phones is a full-fledged OS, Apple is still running mobile.

I’m convinced that the biggest revolution in mobility is a phone-sized device that is a full-fledged computer. It has a full operating system, plenty of storage (at least 1TB), and is capable of docking to a laptop-sized device or a keyboard and monitor for heavier-duty tasks. And it appears such a device is here, or almost here: the HP Elite X3. The Lumia 950 and 950 XL have a similar capability, but not as much onboard storage. No way you could keep your music library on it.
