Disclaimer

The content of this blog is my personal opinion only. Although I am an employee - currently of MIPS Technologies, in the past of other companies such as Intellectual Ventures, Intel, AMD, Motorola, and Gould - I reveal this only so that the reader may account for any possible bias I may have towards my employer's products. The statements I make here in no way represent my employer's position, nor am I authorized to speak on behalf of my employer. In fact, this posting may not even represent my personal opinion, since occasionally I play devil's advocate.

See http://docs.google.com/View?id=dcxddbtr_23cg5thdfj for photo credits.

Saturday, June 06, 2015

Basis B1 won't sync to MacBook / iPhone

I landed in Atlanta an hour ago.



Since then, I have been trying to sync my Basis B1 activity monitor watch, if for no other reason than to get it set to the east coast timezone.



Slowness and eventual failure. Repeatedly.



One big complaint I have about iOS devices in general is that they seem to have much poorer background operation than Android.  For example, on Android my Basis watch would sync without me having to open the application.  Worse, I seem to have to babysit the iPhone to ensure that it does not go into a power-save mode.



I do not know if this is fundamental to iOS, or just a simple matter of programming.  Certainly, many iOS apps & devices behave this way - they operate only when in the foreground.



This reminds me of old MacOS, with "cooperative multitasking", rather than fully preemptive scheduling.



This may be a compromise to save battery life. Certainly my iPhone seems to have better battery life than a Samsung phone with a comparable battery.  But that may be the fault of Samsung bloatware, rather than Android itself.



--



This iOS characteristic does not let Basis off the hook.



The Basis consistently fails at 49% transferred.  At one point I saw a "low memory" warning on the watch.  Low memory when syncing - surely they reserve a minimum buffer size to finish a transfer?




Thursday, June 04, 2015

Making the Outlook Ribbon visible

Making the Outlook Ribbon visible

'via Blog this'

For the last few days I somehow managed to hide the commands on the Outlook "ribbon" - the dynamic set of menus etc. at the top of the screen.

Today I finally got pissed off and googled.

Here's the fix:

Look for the little arrow-like box at the upper right.




Microsoft's help pages were deceptive.


First, the page title is "Minimize the Ribbon", whereas I wanted to restore the ribbon.  Google works better than MS search for finding the "Restore the Ribbon" section title on the same page.




Second, the graphical widget they suggest, i.e.

[screen clipping did not paste]

is not what I see on my screen.

Instead I see

[screen clipping did not paste]

I do not know if this is a difference in software versions, or due to Parallels' Coherence window management stuff.  It appears this way even when Coherence is disabled.




Overall, Microsoft's attitude seems to be that you would only do this deliberately.

Whereas I find it far too easy to do this by accident: the keyboard shortcut, Ctrl+F1, is easy to type unintentionally.

----

Oh joy, I see that the screen clippings that I recorded in Microsoft OneNote  (native to Mac) to show the problem do not cut and paste into Google Blogger.  Screw wasting time trying to fix that.

The words provide a clue, although the pictures would be even more helpful to anyone else with the same problem.

And the blank boxes - if they post - are a sad commentary on the state of data exchange.

---

(Outlook (Office15), on Windows 8.1, running under Parallels on a MacBook.)


Thursday, May 28, 2015

Using Outlook/Exchange email faster on iOS/Android than on Windows!!!!

I just sent email to my coworkers saying that I am about to give up on using Outlook on my PC (currently Windows 8), in favor of reading all of my company email on my iPad.

Why?

a) in part because my company's IT department requires us to use VPN to access our Exchange servers, requiring me to type in extra passwords and adding connection delays when the VPN connection times out due to inactivity - whereas they allow mail reader apps on cell phones and tablets, both Android and iPhone, to access without VPN, using ActiveSync.

(Note: I know that IT can configure Exchange to be accessed across the net without VPN - my company did so before it was acquired.  But apparently our IT department does not trust that, although it was compelled to provide ActiveSync for mobile devices.
     I also have at least one email reader on my iPhone - Triage - that requires VPN, which is a big reason that I no longer use it.
     Apparently Microsoft does not allow PC mail readers to use ActiveSync.)

b) also because of a recent Outlook problem, whereby it is unable to connect to the Exchange server until 15-30 minutes have passed - unless I reboot.


This probably is not big news to Microsoft - but this is something that Microsoft MUST fix, if they want people to continue using PCs.

All things being equal, I would prefer to read my email on my PC laptop, with its big screen, trackball, etc.  (Let's leave aside the question of Linux/Windows or which mail reader - Outlook or Thunderbird or ... - since AFAIK all PC or Mac mail readers have the same problem, in that they would have to go through VPN to access the Exchange servers, whether as native Exchange or IMAP.)

(Hmm, is there an ActiveSync mail reader that can work on a Linux PC?   Or is ActiveSync non-open software?)

But if using my PC is so much slower than using my iPad - well, that's one less Microsoft customer.

I have been considering a Microsoft Surface Pro or similar Windows 10 machine as a convertible tablet - capable of doing real work using PowerPoint and Excel.  But AFAIK the Surface Pro will suffer the above-mentioned problems that slow down my PC email.  If it is slower than an iPad or Android tablet at reading email... one less Microsoft customer.

Thursday, May 07, 2015

https vs http - why not signed but not encrypted https?


From: Andy Glew
Newsgroups: grc.securitynow
Subject: https vs http - why not signed but not encrypted https?
X-Draft-From: ("nntp+news.grc.com:grc.securitynow")
Gcc: nnfolder+archive:sent.2015-05
Date: Thu, 07 May 2015 11:57:35 -0700
Message-ID:
User-Agent: Gnus/5.13 (Gnus v5.13) Emacs/24.4 (darwin)
Cancel-Lock: sha1:3QSNHoOOsLInTT2t9aCFbf/tYoY=

(New user in grc.securitynow. Longtime podcast listener. Very long time
ago Usenet user (not so much nowadays).  My apologies if this is a FAQ.)

OK, so there's a trend to encrypt all traffic - to use https, to
discourage http.  If for no other reason than to make man-in-the-middle
attacks harder.

One of the big losses is caching: the ability for somebody like a school
in a bandwidth deprived part of the world (like Africa, now; like
parts of Canada, when I grew up, although no longer so true) to cache
read-only pages that are used by many people.   Like the website I used
to run, and which I hope to bring back up sometime soon - a hobbyist
website for computer architects.  No ads.  No dynamic content.

Heck, like this newsgroup would be, if it were presented as webpages.

HTTPS encryption, with a different key for each session, means that you
can't cache. Right?



Q: is there - or why isn't there - an HTTPS-like protocol where the
server signs the data, but where the data is not encrypted?

(I thought at first that the null cipher suite in HTTPS / TLS was that,
but apparently not so.)

Having the server sign the data would prevent man-in-the-middle
injection attacks.
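
To make the idea concrete, here is a rough sketch in Python, using
the third-party 'cryptography' package.  The names (page, server_key,
etc.) and the whole framing are my own assumptions, not any existing
protocol: the origin server signs the page body, a cache can store
and re-serve the plaintext bytes plus the detached signature, and the
client verifies against the server's public key before trusting the
content.  Any in-path modification makes verification fail.

    # Sketch only: Ed25519 sign/verify of a cacheable page body,
    # using the third-party Python 'cryptography' package.
    from cryptography.hazmat.primitives.asymmetric import ed25519
    from cryptography.exceptions import InvalidSignature

    # Origin server: sign the page once.  The plaintext plus the
    # detached signature can be stored and re-served by any cache.
    server_key = ed25519.Ed25519PrivateKey.generate()
    page = b"<html>...read-only hobbyist page...</html>"
    signature = server_key.sign(page)

    # Client: the public key would come out of the usual
    # certificate-chain handshake; taken directly here for brevity.
    public_key = server_key.public_key()
    public_key.verify(signature, page)   # untouched cached copy: OK
    try:
        public_key.verify(signature, page + b"<script>bad()</script>")
    except InvalidSignature:
        print("injected content rejected")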

An HTTPS-like handshake would be needed to perform the initial
authentication, verifying that the server is accessible via a chain of
trust from a CA you trust.  (Bzztt.... but I won't rant about web of
trust and CA proliferation.)



Possibly you might want to encrypt the traffic from user to server,
but only sign the traffic from server to user.



So, why isn't this done?


It seems to me it would solve the "HTTPS means no caching" problem.




OK, possibly I can answer part of my own question: signing uses the
expensive public key cryptography on each and every item that you
might want to sign.  Whereas encryption uses relatively cheaper bulk
encryption, typically with symmetric ciphers like AES.

Signing every TCP/IP packet might have been too expensive back in the early days
of the web.  Not to mention issues such as packet fragmentation and reassembly.

But note that I say "each and every item that you want to sign".
Perhaps you don't need to sign every packet.  Perhaps you might only
sign every webpage.  Or every chunk of N-kiB in a web page.
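
Continuing the sketch above - again my own assumptions (the 16 KiB
chunk size, the sign_chunks / verify_and_render helpers), not a real
protocol - per-chunk signatures would let a client verify as the data
arrives instead of waiting for the whole page.  Signing the chunk
index along with the chunk keeps a man in the middle from reordering
or replaying chunks.

    # Sketch: one signature per 16 KiB chunk (the chunk size is an
    # arbitrary assumption), so a client can verify incrementally.
    CHUNK = 16 * 1024

    def sign_chunks(server_key, body):
        """Yield (index, chunk, signature): a signature per chunk,
        not per packet.  The index is signed too, so chunks cannot
        be reordered or replayed in transit."""
        for n, i in enumerate(range(0, len(body), CHUNK)):
            chunk = body[i:i + CHUNK]
            yield n, chunk, server_key.sign(n.to_bytes(8, "big") + chunk)

    def verify_and_render(public_key, signed_chunks):
        for n, chunk, sig in signed_chunks:
            # Raises InvalidSignature on any tampering.
            public_key.verify(sig, n.to_bytes(8, "big") + chunk)
            # ...hand the verified chunk to the renderer here...

One public-key verification per 16 KiB is far cheaper than one per
packet, though still not free - which is the cost question I come
back to below.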

A browser might not want to start building a webpage for display until
it has verified the signature of the entire thing.   This would get in
the way of some of the nice incremental fast rendering approaches.

But, perhaps the browser can incrementally render, just not enable
Javascript until the signature has been verified?   Or not allow such
Javascript to make outgoing requests?   I am a computer architect: CPU
hardware speculatively executes code before we know it is correct, and
cancels it if not verified.  Why shouldn't web browsers do the same?

I.e. I don't think latency of rendering should be an obstacle to having
cacheable, signed but not encrypted, HTTPS-like communication.

Probably the plain old computational expense would be the main
obstacle. I remember when just handling the PKI involved in opening an
SSL connection was a challenge for servers. (IIRC it was almost never a
challenge for clients, except when they opened too many channels to try
to be more parallel.)  What I propose would be even more expensive.



But:

(1) CPUs are much faster nowadays.  Would this still really be a
problem?

+ I'm a computer architect - I *love* it when people want new
computationally demanding things.  Especially if I can use CPU (or
GPU, or hardware accelerator) performance, which is relatively cheap,
to provide something with social value, like saving bandwidth in
bandwidth challenged areas of the world (like Africa - or, heck,
perhaps one day when the web spans the solar system).

(2) Enabling caching (or, rather, keeping caching alive) saves power -
power in the real, watt-hours sense - while creating and verifying
signatures consumes CPU cycles.  I am not sure that the tradeoff
prohibits what I propose.

Wednesday, May 06, 2015

MacBook doesn't like surprises: USB and display problems

Effing MacBook:



As usual, when I disconnect at home and come to work, I waste 15-30 minutes trying to get my displays, keyboard, and trackball working.  If it were always the same problem I might have figured out a workaround - but the problem changes in minor ways each time.



Today, the problem was that when I plugged in at work, my external monitors worked, but my laptop LCD monitor was not working.  Black.  Not reported by System Preferences => Displays.



(Usually it is one or both of the external monitors that do  not work. And/or the USB keyboard and trackball.  But today it is different.)



Various attempts to fix it, like unplugging the external monitors, going to sleep, etc., do not help.  Unplugging the external monitors and going to sleep / waking up => a laptop with a blank screen.



So, as usual, I rebooted.



At the moment, I am rebooting my MacBook once or twice a day.  Usually to try to get a display to work, or to get my USB keyboard and trackball working.



Note that I say "both" external displays above.  I only have two external displays: a 30" HDMI monitor and a USB BVU195 display adapter.  On Windows I used to have 3 or 4 external displays, as well as my laptop LCD.  Not so on the MacBook.



I expected the Mac to handle multiple displays better than Windows?  Not in my experience! :-(



(MacBook Pro 15" retina Mid-2014)



---



I wonder if these problems are related to "surprise" disconnects - disconnecting from USB and external monitors without doing something in Mac OS X first.  Windows used to have problems with such surprise disconnects circa 2000; perhaps Macs are just behind.  But I can't find any way to tell Mac OS X "I am about to disconnect you now".




Thursday, April 30, 2015

The Grammarphobia Blog: Why is “m” a symbol for slope?

The Grammarphobia Blog: Why is “m” a symbol for slope?: "In Mathematical Circles Revisited (2003), the math historian Howard W. Eves suggests that it doesn’t matter why “m” has come to represent slope.

“When lecturing before an analytic geometry class during the early part of the course,” he writes, “one may say: ‘We designate the slope of a line by m, because the word slope starts with the letter m; I know of no better reason.’ ”"



'via Blog this'