Technology-related projects

Techdemo (2012), 3D printing (2012), Graph project (2011), Structure from Motion (2009, 2007), Geodesics (2009, 2008), Flash PITF (2004), Sofia AI project (2003/2004), CCP I, II (2001)

Technology notes: 2017, 2007-2010, 2011-2015, 2016, Now

02-12-2017: A wise recommendation: keep paper backups for voting.
23-11-2017: A good article on the Intel Management Engine disaster, Nov 23. Some background about this troubling technology is given here. The talk of Joanna Rutkowska is here. It starts with "Personal computers are extensions of our brain. They are insecure and untrustworthy", an example of a well-delivered presentation. We usually have assumed that hardware is trustworthy. One of her conclusions: today we cannot assure secure boot. Rutkowska says: "ME is an ideal backdoor and rootkiting infrastructure". It is part of the "zombification" of computing: the hardware contains operating systems which nobody can look at and which nobody can disable. Not even a secure OS like Qubes can prevent the ME from taking over.
21-11-2017: We are on the brink of a most terrible technology decision: the repeal of net neutrality. A NYT article puts it well: the internet might become a "pay per view" technology, at least in the US. Why a single person like the boss of the FCC (a proven lobbyist of the telecoms) can make such a decision on his own is totally beyond me. It might lead to a much weaker US economy in the long term. There were other attempts at bad decisions recently, like health care changes which border on making it appropriate to call the lawmakers terrorists, as it would have terrorized a large part of the population (the definition is "the use of violence in the pursuit of political aims, religious or ideological change"). About 25 million would have lost insurance, meaning the death of tens of thousands of Americans, definitely far more than 9/11. (John McCain with his famous midnight "thumbs down" vote probably saved more lives than any general in the history of mankind.) As taking away health care obviously kills people, it is an act of violence. It is not stabbing somebody to death, it is just watching the person bleed to death without doing anything, and that is violence too. Changing net neutrality will not kill people, it will kill businesses. Maybe not worldwide, as other countries are not that stupid. One can just say it is not only idiotic, it is also deeply unpatriotic. Here is an article in "Entrepreneur" explaining a bit of the small business aspect. And from the many cartoons:
17-11-2017: Just got one of these 5TB USB 3.0 hard drives for backup, 140 dollars. This is great for long-term backups which are not overwritten. Having a growing digital library to back up requires larger capacities, and 5 TB is currently enough for a full backup. Why is it important to have a local electronic library (books, music or movies)? First of all, streamed content changes. Netflix might offer a movie now but no longer in a year. Streamed music changes: you might hear a song now; in the future, the song has been modified to a "more modern taste". This could just be the beginning. It could be, for example, that some movies are changed or modified, maybe because a scene has become too offensive, maybe because an actor is no longer wanted to be associated with the movie and that part is cut out. There are typically many versions of a movie available, extended versions, director's cuts etc. You might want to hold on to a version, as in the future, with a streaming service, that version might no longer be available, or might be modified. In a dystopian future, we can imagine that electronic books are censored and changed. We are already in a time where, if you read books with electronic devices controlled by third parties, every one of your reading sessions is recorded, registered and used statistically (how long did you read which page, how much of a book did you read, where did you read it, etc). Files can be modified, changed, censored, cleaned of possibly politically incorrect parts (taste changes with time), of offensive parts, or of parts with critique of a regime. We are already there. There are "clean versions" of movies available, where for example any violence is gone, where bed scenes are gone, where inappropriate language is cut out or "beeped out". Some documents might be deleted due to legal or other quarrels. If you look at history or other parts of the world, there are many instances where access to information has been disabled, maybe for ideological reasons. It happened in the past that documents bought on Kindle were "repossessed". Music, text or movie documents could in the future, without you knowing it, be modified, watermarked or cleaned up. This already seems to happen with documents submitted to the "cloud": as the cloud provider does not want to store too many identical files, it might "replace" your file with another "identical" one. Do we get back our own file, or a new, different version? Some documents might in the future just disappear. Decades ago, one had to burn books in order to ban them. Now it is much easier and more subtle: just modify the book and keep it available. A user bound to a dumb device like an iPad or Kindle might not notice. If you own the file, you can OCR it and compare text, sound or video. The technology to modify documents, pictures, even movies has improved dramatically. So, thank you very much, but I keep buying my media (books, movies, or music) and back them up. A look at history shows that blind trust is not always best.
12-11-2017: A nice article about augmented reality ("data-vomit gush"). There is especially a link to a movie showing the first experiments with virtual reality by Ivan Sutherland from 1968. It is related to the MIT Lincoln Labs in Lexington. Sutherland is also known for Sketchpad. Currently, the European tech sites like the Reg or Heise kick ass. Heise just now has a nice article about how Face ID was cracked with a mask. By the way, when looking back at these historical videos, it becomes evident how far ahead universities (like MIT) were at that time! Now, cutting edge technology is outsourced to companies. This is also happening at amazing speed in higher education. They don't even try to fight. It makes sense financially, of course, to outsource IT, to outsource mail, to outsource course website technology, even to outsource teaching. But it could soon mean the end of a golden age of "higher education" as a place where innovation happens. Impossible? We have seen it happen in the automotive industry. It is not inconceivable that in 50 years, Boston is the new Detroit. If this looks ridiculous, just look at how far ahead the Lincoln Labs were in 1968. Companies like Microsoft (1975) or Apple (1976) were not even conceived then. P.S. There had been previous times where industries were ahead of the game. IBM, Xerox or Bell Labs come to mind, so it is maybe not such a new thing. It is just the scale which is much different.
11-11-2017: Installed High Sierra on one of the Macs. I think the system is now faster. No problems so far. Actually quite amazing, as so much has changed under the hood. As I had performance issues with Keynote on my laptop, I also upgraded the laptop. Maybe it can now run Zoom and Keynote at the same time.
11-11-2017: A slightly modified comment posted on this story: What I want from a programming language are Standard, Stability and Speed. Nobody minds the little quirks, redundancies or the lack of elegance. When I program something today, I want it to run in 10 years, without modifications! In particular, I want the language to still be around and the grammar, once put in place, to stay a standard. I want the program to run stably. In particular, I expect developers to be VERY VERY careful when changing the compiler. Even small changes annoy. C has been quite good, but recently it was no longer possible to run gcc -lm example.c . Linking the math library now requires gcc example.c -lm. WTF. One has to change 700 Makefiles just because somebody thought this is more elegant? I don't mind if a language is extended or sped up, but don't change old grammar, not even the smallest things. There is a lot of code around which would need to be fixed. I'm particularly cautious about adopting a new language, even if it is only a wrapper. They first hype and spike. In the worst case, the developer gets over-excited and changes the language again and again. In the second worst case, the language gets abandoned. A language needs to earn respect, prove that it is stable over a long period of time, that it is reliable and fast.
06-11-2017: My Zoom setup for teaching Math E 320: a picture from Monday, November 6, 2017. I had problems running both Zoom and Keynote on the same laptop. I currently feed the slides from a second laptop which joins the meeting too. There is a large monitor attached which also makes things more comfortable. Click on the picture below to see it large (10 Meg file).
24-10-2017: An extended comment on this Register article: The analogy with utilities is deeply flawed. Information is not a utility. It can be (1) sensitive, (2) crucial, (3) require big pipe capacities and (4) require a healthy IT culture to be handled properly. We once played as clueless kids on mainframes, asking "mommy" (the sysadmin) for computing time, then became autonomous and educated, and now return to the nursing home, paying the nurse (the cloud provider) for every second of service (computing time).
  1. Information is not a utility. Water, gas or electricity do not contain possibly sensitive information which needs to be protected. If a utility provider goes down, it is bad but not deadly. Losing data in a "cloud", or having data diffuse away to a third party, can kill a business, as leaked information remains leaked forever. If one of the major cloud providers loses control, it could even lead to a recession, as many businesses would fail. Water, gas and electricity are information-free quantities; data files are not: they can be personal and crucial for a business.
  2. Information technology is vital. A power station going down or a water pipe getting repaired is a temporary inconvenience. A data loss or data leak is irreparable and would be especially bad for the financial, health and educational sectors. As a private person, I can survive for weeks without internet, electricity, gas, even water, and still keep up essentially the same productivity. A modern laptop can be powered by solar, it is possible to work even in candlelight, and water can be bought in bottles. Such resilience is not possible with cloud IT.
  3. Information pipes are way too narrow. A big problem with delegating IT to third parties is the internet infrastructure. Especially in the US, it is weak and expensive. The last mile is the main sore point. For a utility like water, gas or electricity, capacity is not a problem. Now, with net neutrality currently dying in the US, it will become even worse. We will have to pay more, maybe even more for backing up large amounts of data on a remote data server.
  4. Lack of a healthy IT culture. A consequence of delegating things elsewhere is a loss of IT culture. In the short term, it can make sense, as the cloud suckers still dump prices to keep people hooked and destroy local IT infrastructures. Once that is dead, it is difficult to build it up again, and higher prices are likely to follow. Yes, it is good that we don't have to uudecode an attachment by hand any more, that most computers now need almost zero maintenance, that backups can be automated onto a time machine etc, but it also means for many institutions that the IT culture is hollowed out.
22-10-2017: The exhibit Can you hear the sound of a simplicial complex uses MP3 files triggered by mouse click. I first used "onmouseover", but sound is generally annoying in web content when it appears unexpectedly. Most of this page was generated fairly automatically. The eigenvalues of the matrices correspond to the sound frequencies. Mathematica generates the sound and image files.
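To illustrate the idea, here is a minimal sketch in Python (not the Mathematica code actually used for the exhibit; the matrix, the frequency range and the file name are made up for illustration): the eigenvalues of a symmetric matrix are scaled into an audible band and superimposed as sine tones, then written out as a sound file which a page could trigger on click.

  # Sketch: map the eigenvalues of a symmetric matrix to audio frequencies
  # and write a short WAV file (assumes numpy is available).
  import wave
  import numpy as np

  A = np.array([[2, -1, 0],
                [-1, 2, -1],
                [0, -1, 2]], dtype=float)    # example matrix

  rate, duration = 44100, 2.0
  t = np.arange(int(rate * duration)) / rate

  ev = np.abs(np.linalg.eigvalsh(A))
  freqs = 200 + 600 * (ev - ev.min()) / (ev.max() - ev.min() + 1e-12)  # 200-800 Hz

  signal = sum(np.sin(2 * np.pi * f * t) for f in freqs)
  signal = (0.3 * signal / np.abs(signal).max() * 32767).astype(np.int16)

  with wave.open("complex_sound.wav", "w") as w:
      w.setnchannels(1)
      w.setsampwidth(2)        # 16-bit mono
      w.setframerate(rate)
      w.writeframes(signal.tobytes())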
17-10-2017: The limitation of Twitter to 140 characters is a standard which should not be given up lightly. We have a new unit, "the tweet". If Twitter changes it to 280 characters, it should be called differently, like a "roar". Limitation is an interesting challenge, especially in code. Sometimes one has to fight a bit, like in this post on the energy theorem: I had to leave out the semicolons after the definition of the connection matrix and the definition of the energy. But I wanted to cover the complex given at the beginning of the talk about this energy theorem. I think Twitter would make a "cultural" mistake, as 140 characters has become a "cult". I wonder what the tests will reveal.
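For reference, here is what the tweeted computation does, written out as a Python sketch rather than the Mathematica one-liner of the post (the example complex is chosen here only for illustration): the connection matrix L has entry 1 where two simplices intersect, and the energy theorem states that the sum of the matrix entries of the inverse of L is the Euler characteristic.

  # Check the energy theorem on a small simplicial complex (assumes numpy):
  # sum of the entries of the inverse connection matrix = Euler characteristic.
  import numpy as np

  G = [{1}, {2}, {3}, {4}, {1, 2}, {2, 3}, {3, 4}, {1, 4}]   # the circle C4

  n = len(G)
  L = np.array([[1.0 if G[i] & G[j] else 0.0 for j in range(n)]
                for i in range(n)])

  energy = np.linalg.inv(L).sum()                    # sum of entries of L^{-1}
  chi = sum((-1) ** (len(x) - 1) for x in G)         # Euler characteristic

  print(int(round(energy)), chi)                     # both are 0 for the circle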
26-09-2017: After upgrading Keynote, it started to have some hiccups when exporting a movie. See here. Keynote has improved performance a bit. When using Zoom, I have had terrible problems, almost bringing down my machine. I still present from a second computer, as Keynote sucked all resources from the machine (a brand new Macbook). Unrelated is the problem that Keynote uses a lot of resources with large presentations. I have problems running it on the same machine together with Zoom while teaching. My solution is to run a second laptop on a second account and join the meeting from there. The second laptop is only used for presentation and has no video in Zoom. This works.
09-09-2017: While looking up information on log tallies for a lecture in Math E 320 (see blog), I came across some papers made available in Google Books. Google Books is a great project but starts to close up more and more. It needed some work to get this article and place it onto a local machine: screenshot page by page and glue it together. The log tally is on Wikipedia, as well as on several blogs, incorrectly attributed to Schenck because the Google Books document shows this book title. It would really have helped and prevented misunderstandings if the entire book could be downloaded as a PDF. It is a small thing, but it contributes to a feeling that we live more and more in a time of "IT infantilisation": music and videos are streamed, not owned. Books need to be read in reader devices or software like Kindle or "Google Play", where readers are tracked on their progress. Using software and media "as a service", one is evaluated and constantly monitored by a mainframe server somewhere. It is a "cloudy business". Major applications are already there: Google Docs, Microsoft Word, Photoshop, calendar software, note-taking software, backups. Heaven forbid that a user or "customer" has anything they "own". It is better to have the user as a child who needs a guardian to function. Even computer algebra systems now have cloud versions. I stopped using Adobe Photoshop once it was "on the cloud" and would also stop using major CAS if they went "cloud only". It is not only the users who have become kids who are constantly watched and controlled. At the moment, entire industries and universities outsource their IT structures. If the three main players Amazon, Google and Microsoft were to cave, then not only would their industries disappear, essentially everything would collapse. The players have become too big to fail but are still not too big to merge. A brave new world scenario is one where it is impossible to read or write anything without being tracked and marketed, where information is controlled by two to three players who, due to lack of regulation and the few remaining competitors, start to syndicate. Even more scary is the prospect of a disappearing personal computing infrastructure for the home user, where computing can only be done in smartphone-like operating systems in which the user is jailed, or where one is billed for "computing as a service" on the "cloud". In such a world, a new player in the industry has little chance. Their innovative ideas are mined directly from the servers and fed into the artery of some giant. Not that people would have to "borrow the ideas": it will be machines, trained with sophisticated algorithms, that search through petabytes of information entrusted to a few servers. It is necessary to make as much information as possible public. But it should also be a matter of choice what becomes public or part of a third party and what does not. A start-up building up ideas needs to be able to do that without being sidelined by a large bully. Health data, start-up ideas, financial data or voting data need to remain safe. One could imagine, for example, a software which goes through some cloud servers, looks for new ideas and submits patent applications if something interesting has been found. In the near future, it could happen that "owning a file" on a local computer is technically impossible, as the operating system is by design told to share everything with a central computer. A hack of a centralized system or a collapse of a data provider will then be much more severe.
Just two days ago, it was announced that the credit information of 143 million Americans has been exposed. Certainly, "big data" analysts have already started to mine and sell this data, as it is very valuable. The "Equifax super-GAU" prompts thinking about "decentralisation". There are data sets which need to be safe and kept from the public (like bank, credit, voting or health information), and then there are data which need to be free and public domain, like an article written one hundred years ago. What is needed? First of all, bulletproof strong cryptology for industry and private folks (this already exists, fortunately, but there are forces which try to take it away). Second, less centralization and more diversity in IT structures. Third, a healthy IT group in each industry and university, as well as a well educated general population who can stand on their own feet and handle their basic computing needs, so that one cannot become a hostage of a few giants who, if one goes down, take everything with them. It also appears healthy if copies of media are kept independently. A dystopian future like Fahrenheit 451 is still a possibility. Technology has made it possible to censor or change media content, not only text, but also pictures and movies. Having only centralized "cloud" versions would enable such manipulations. This already happens in various places in the world. [Update September 24: Cloud computing has just started to charge by the second. It reminds me of an Encounter with Goldbach at a time when "mainframe computing" = "cloud computing" had its first appearance. We were infants at that time. We have again become infants today. Anyway, it is psychologically bad, especially in development and research, to be billed by a service. If one makes the investment in local hardware, one is encouraged to do computations and use it to the fullest. With the service model, a researcher has to question every second of computing time. Mommy, do I get a dollar to do this computation?]
18-07-2017: Links for a technology demo for today: An animated picture, Strong lattice, Fluid dynamics, fluid, Bubbles, Vortex, Sphere, Surface, cloud
11-07-2017: An important message from Vi Hart:
11-07-2017: I use my 12 inch Macbook every day, maybe 5-6 hours per day on average. Now 2 years old, it shows a battery service warning. Yes, the battery empties faster (5-6 hours now rather than 10) and still looks fine, but it seems its life will not last too much longer. The keyboard also shows its age. I type a lot. Some keys have lost their markings, which is not a big deal; others have started to become less reliable. I cleaned out some, like the space key, but removing a key risks breaking one of the tiny plastic latches (which happened to me). The keyboard would also need to be replaced. The risk is now there that one of the keys breaks for good, making the laptop unusable. I have done replacements of individual keys for Macbook Air laptops before, but it is quite expensive. To service the battery, 200 dollars; to replace the keyboard, again at least 200; then the time to schedule appointments with the Genius Bar etc, a couple of hours, and having the laptop not available for weeks. It would just not be feasible. I decided to use the still well-working laptop as a backup machine and get a new 12 inch one. The strategy to buy relatively cheap laptops but replace them regularly appears better than having an expensive one (Pro) but still facing the same long-term problems with battery, hard drive and keyboard, which just happen to fade after 2-3 years of heavy daily use. I use the same strategy for my bike, which sees at least 3000 miles per year (I ride rain or shine, snow or heat, every day). After 2-3 years, the bike too starts to fail everywhere, and servicing it costs half of a new one. Also here, "buy relatively cheap but replace often" appears to be more effective than having a really expensive one. Then there is the risk of having it stolen, which is always there both for laptops and bikes and which would be devastating with a 3 times more expensive laptop or a 10 times more expensive bike.
04-07-2017: A vulnerability in RSA encryption illustrates that not only the mathematical security but also the actual implementation is important. In this case it is the way the modular multiplication is done, which allows recovering some of the bits. Important work, as crypto security is crucial for a functioning society (banking, trade, health care, voting). See the Heise article.
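The article is about a specific weakness, but the general principle can be sketched (my own illustration in Python, not the code analyzed there): in textbook square-and-multiply exponentiation, the amount of work per exponent bit depends on the secret bit, exactly the kind of implementation detail that can leak key bits through timing or power measurements.

  # Textbook square-and-multiply modular exponentiation.  The extra multiply
  # happens only for 1-bits of the exponent, so timing or power traces can
  # correlate with the secret exponent.  (Illustration only; real libraries
  # use constant-time arithmetic and blinding.)
  def modexp(base, exp, mod):
      result = 1
      base %= mod
      multiplies = 0
      for bit in bin(exp)[2:]:                 # most significant bit first
          result = (result * result) % mod     # always: square
          if bit == "1":
              result = (result * base) % mod   # only for 1-bits: multiply
              multiplies += 1
      return result, multiplies

  print(modexp(7, 0b101101, 221))   # work depends on the number of 1-bits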
22-06-2017: Why does one use &infin; in HTML while TeX uses \infty? The discrepancy is kind of annoying. The infinity symbol was introduced in 1655 by John Wallis. But who is to blame for the incompatibility? I think it might have been HTML, as the Unicode Consortium was incorporated in 1991 and the first versions were built in 1986-1987, while TeX was released in 1978. ASCII came earlier but does not feature the infinity symbol (which is kind of a shame if one looks at the other things which were chosen instead: see the List of ASCII codes). Apropos: the incompatibility between these two languages is not a biggie. The extended ASCII flavours however were, and we still have to suffer from the sins of corporations which tried to embrace and destroy competition by inventing their own character sets or even their own ASCII versions. Still today, both in Adobe and in Word texts, one has characters like -, ", which look like ASCII but are not. Platform specific character codes remain annoying. It is good that both the Unicode and W3C consortia have gotten their act together.
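As a practical aside, a small sketch (a hypothetical helper, just to illustrate the "looks like ASCII but is not" problem) which flags such characters in a text:

  # Flag characters that look like plain ASCII punctuation but are not,
  # the kind of debris copied out of Word or PDF documents.
  import unicodedata

  LOOKALIKES = {
      "\u2013": "-",    # en dash
      "\u2014": "--",   # em dash
      "\u201c": '"',    # left curly quote
      "\u201d": '"',    # right curly quote
      "\u2019": "'",    # curly apostrophe
      "\u00a0": " ",    # non-breaking space
  }

  def flag_non_ascii(text):
      for i, ch in enumerate(text):
          if ord(ch) > 127:
              name = unicodedata.name(ch, "UNKNOWN")
              hint = LOOKALIKES.get(ch)
              print(f"pos {i}: U+{ord(ch):04X} {name}" + (f" -> {hint!r}" if hint else ""))

  flag_non_ascii("TeX writes \\infty, HTML writes \u221e \u2013 both mean \u201cinfinity\u201d.")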
17-06-2017: Having switched my 4K monitor to be a second monitor for the Mac, I have tried a curved monitor (Dell UltraSharp U3415W PXF79 34-Inch). With a 3440x1440 resolution it does not match my 4K monitor with 3840x2160, but actually (maybe because my eyes are also getting older), I prefer to have a bit of a larger font while working. The widescreen (21:9) aspect ratio is very comfortable to work with. Here is a screenshot (click on the picture to see the full 3440x1440 pixel screen shot):
16-06-2017: A rare event: Youtube is down. Interesting error message (for Google developers to debug): (click for larger picture).
15-06-2017: A Heise article illustrates how Ethereum has heated up the cryptocurrencies. Ethereum is a gold rush, while bitcoin tanks (for now). These things are always a bit of a pyramid scheme, but the blockchain technology looks hotter, as one can run code in decentralized applications. It also allows building smart contracts. The Ethereum virtual machine is Turing complete software which can run any program; it is kind of like a universal Turing machine. This makes it interesting in a more general sense. The Ether currency shows exponential growth: ether or bitcoin.
10-06-2017: The SEO optimizers have become more sophisticated. It used to be stupid. But today, I got a personal email from a "math student" who for a "geometry project" needs to have a page linked to get "extra credit". Who does not want to help a student? The page however did not look like a project page. Yes, it had some information on it, but not done by a student and only remotely related to geometry. I asked back for the name of the school and the name of the teacher, but it was probably a waste of time. Must have been spam.
09-06-2017: Apple programs like Final Cut, Garageband or iPhoto feature an annoying violation of a "clean slate policy": the program by default starts with the previous project loaded. This is sometimes useful, yes, but annoying if one works on many different projects at the same time. Yes, one could create different libraries, produce smart collections etc, but it is an attempt of the program to emulate part of the operating system. I don't want to rely on a program to get organized; I personally like to start every program with a clean slate, so that if I start with a project, only the components of that project are known to the program. Keynote, another Apple program, does this nicely. I can open a project "open presentation.key" and do not have to worry about other presentations or work with different settings etc. If I open a project "open project.finalcut", then the program should not know about older parts. Now, even if you use Final Cut and move a project somewhere else on the hard drive, the program will still find it and sometimes even load it. As I don't want to throw away old projects, I put them into another folder and make that folder invisible (chmod 000 backupfolder), then work on a new project. I do the same with the Apple Photos app. I'm not interested in pictures taken a month ago. I don't want to have them even somewhere in a library nearby. I want to start with a new film, having organized the pictures I want to keep elsewhere. Also here, I now just put the old library in a directory and chmod 000 it so that the app does not pick it up. Similar violations of independence have started with the browser, where the program also wants a larger share of the operating system. I want the browser to start an independent process in which I'm not linked in to services like Google. I might work on different parts, where in one browser I'm logged in for one project and in another browser for another project, and they are required not to know about each other. Mathematica also violates this policy when using the GUI. It does work well, however, if one uses Mathematica from the command line, a reason I mostly work on the command line. This enclosure mentality is annoying and assumes that a user works on one thing only. For the web browser, I now use different browsers for different things to keep them independent, like separating department work, administrative work, work for research, work for teaching, work for family, or private work. It would be easy to fix: whenever the user starts the browser anew, it should start an independent process, or one should be able to configure it as such. This is the default for most applications. Why is compartmentalization important? It reduces the risk of mixing things up and adds more accountability in case something goes wrong in one part. It puts the burden of organizing projects onto the operating system and not onto the individual programs. Localization and decentralization simplify and are more robust. It also produces "commutativity" of actions; having everything loaded at the same time makes things depend on each other. Another reason is that most programs now communicate with some server, sending information back and forth. I'm wearing different hats when working on different projects and don't want to have to change my computer to change from one project to the other. So, back to Final Cut: the last couple of days, I was uploading 30 hours of youtube videos for a conference (it is a project with half a terabyte of movies).
It is important not to get mixed up with different videos and renderings of different sessions. It is a time-consuming process where not much can be automated, as rendering and uploading take hours, each video clip needs to be trimmed and annotated, and the uploads fail (probably every third upload needs to be redone or done several times, the reason still being mysterious). I first thought it was the hard-drives-going-to-sleep problem [which is an additional, unrelated annoying feature of many external drives, burned into the firmware, so that one can only bypass it with helper programs touching a file on the drive every few minutes]. These upload failures happen also with an essentially fresh Final Cut setup. As I have terabytes of movies in my libraries, one could easily blame it on the too-large library. Now I know it is a problem which must be blamed on the ISP sometimes resetting the network, or on a Final Cut instability, or on a youtube problem. Strangely, it seems to be more frequent during the day than at night, which would point to a network instability problem (youtube just comments "upload canceled", and the sharing progress usually stops around 51 percent). As usual in IT, it is the failures and limitations of tools which make up the time-consuming parts, as one has to find ways around these limitations or redo things. It is not like 40 years ago, when virtually everything in IT would fail first [when starting with experimental mathematics as a teenager, I had to store my first Basic programs on tape, and often even that basic saving process would fail, but that was the norm]. Now, bugs have become rare, but they still eat most of the time resources. And because bugs are rarer, they are also perceived as more annoying.
01-06-2017: An article in NPR about soundtracks produced by computer composers. This is fascinating. We have used Mathematica to compose a couple of times: examples. See also the lectures on Music and Calculus and AI.
31-05-2017: Another 5 hours of rebuilding my office machine. Since this is unpredictable, it is good to do this between semesters. While switching hard drives, one of the SATA cables broke off. The SATA connector of one of the important drives got stuck in that part of the cable, killing both the cable and the drive. I got really mad because it was a nice new hard drive. I decided therefore to get one of these hot-swappable hard drive enclosures (IStar 2BAY 2 x 5.25 To 3 X 3.5 Cage) and also got new sturdy SATA cables. Since my workstations are silent Thinkmate machines, I was worried that the additional fan would be noisy, but the enclosure shields it well. I also did a fresh install of the operating system, as my SSD has gotten old. Some minor surprises in Ubuntu 17.04: Perl now ignores local libraries and local files by default. An entry "export PERL5LIB=./:$PERL5LIB" in the .bashrc file works around this really annoying change. It is the @INC variable, which is set when Perl is installed and which no longer looks for local libraries. What were they thinking? Another hiccup with the ftp server which accepts pictures from the LAN webcam: I should have known, it is not the first time, but if the configuration file (here /etc/vsftp.config) does not have the right permissions and is not owned by root, the server does not start up (without complaining). Ubuntu now talks too much whenever one sshs into the machine. The chatty motd scripts are in /etc/update-motd.d. One could delete them, but they might be handy at some point. The easiest way to shut this off is to edit a file /etc/motd containing what one wants to display. Now it just gives a line telling when and from where the last login was. Also took the opportunity to upgrade Mathematica.
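Since the server fails silently, a quick look at the config file before blaming the daemon can save time. A small sketch (a hypothetical helper; the path is the one used above, and what exactly counts as the "right" permissions should be checked against the vsftpd documentation):

  # Report owner and mode of the ftp server configuration file.
  import os, stat

  cfg = "/etc/vsftp.config"          # path used on this machine
  if not os.path.exists(cfg):
      print(cfg, "not found")
  else:
      st = os.stat(cfg)
      print("owner uid :", st.st_uid, "(0 = root)")
      print("mode      :", oct(stat.S_IMODE(st.st_mode)))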
26-05-2017: It is rare these days to get into the case-sensitivity trap on OS X. I regularly sync a work directory from my office machines with my laptop, which has a case-insensitive file system. If two files like g.pdf and G.pdf are present in the same directory, one will bite the dust. It is usually no problem, but I just got bitten by this once more. I format external drives on the Mac with case-sensitive file systems, but it might still be a risk to do that for the main drive. OS X is well done and almost perfect, but the case sensitivity is one issue which needs to be solved.
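A small sketch (a hypothetical helper, not part of my actual sync setup) which scans a tree for names that would collide on a case-insensitive file system before syncing:

  # Find files and directories that collide case-insensitively,
  # e.g. g.pdf vs G.pdf in the same directory.
  import os, sys
  from collections import defaultdict

  def collisions(root):
      for dirpath, dirnames, filenames in os.walk(root):
          seen = defaultdict(list)
          for name in dirnames + filenames:
              seen[name.lower()].append(name)
          for names in seen.values():
              if len(names) > 1:
                  print(dirpath, names)

  collisions(sys.argv[1] if len(sys.argv) > 1 else ".")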
24-05-2017: Spent an afternoon with a strange bug on my home machine. For some reason, the Ubuntu installer always produces a garbled screen. The machine is fine, the graphics card is fine and works both under Linux and Windows, and the install media are fine (they work on another machine). I excluded USB problems by using various flash drives or USB hard drives, tried out various other BIOS settings and then also used another monitor. I currently suspect that a low-level graphics mode is buggy, either on the motherboard or on the graphics card. Anyway, an afternoon gone.
12-05-2017: Our phones are now voice over IP. It is funny how the information leaflet mentions that "voice mail service has moved to the cloud". Dudes, it is just VOIP, web, internet. But I guess now everything has to be the cloud for marketing reasons. One of the arguments against VOIP had always been redundancy and that things work even if the network is down. But as most people now have cell phones, a traditional phone line for emergencies is no longer so important.
26-04-2017: The Register today mentions the plan of Ajit Pai (head of the FCC) to kill net neutrality. No wonder: this guy was close to Verizon before going into politics. It would not surprise me if he is still close to their lobby. Killing net neutrality could be one of the worst consequences of the Trump presidency, which so far has a common theme: totally unqualified people are put into positions they never should be in. Even the relatively conservative "The Hill" calls it a "war on consumers". The EFF calls the proposal "devastating for competition, innovation and free speech". Indeed, its consequences could be terrible both for the economy and for democracy. It is time to contact the representatives.
18-03-2017: The Google JPG encoder Guetzli is everywhere in the news. Here is the Google blog and here is the paper explaining the iterative optimization. I could not compile it from scratch on an older Ubuntu 14.04, but on OS X it compiled well. A test with a first picture gave a 7 percent reduction from 29981 bytes to 27969 bytes. A compression of this picture did not go through yet; probably too large. For the smaller version it took 40 seconds to reduce from 162589 to 120391 bytes (about 26 percent). Not bad. But for the larger 12 Meg picture, a re-encoding would take an hour. It would take days to re-encode one of my panorama pages. It is not the first time that a Swiss name has been used: there is also a Zopfli compression algorithm by Google. Why Swiss names? Some of the Google researchers like Jan Wassenberg are based at Google Zuerich. Wassenberg came from the Fraunhofer Gesellschaft, a German research organization. The JPG 2000 data compression standard came from there, which, similarly to Guetzli, has a small compression advantage. It largely failed however because hardly any browser supports it (Firefox and Google Chrome do not) and also because it is riddled with patents, which is a death sentence. [Update: In a test with a large panorama of 12 Megs, Guetzli worked for 2 hours and got out a 14 Meg version. The algorithm definitely seems to have difficulty with very large files.]
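For the record, the reported reductions recomputed (a quick arithmetic check, using the byte counts measured above):

  # Recompute the compression percentages from the measured file sizes.
  tests = [("first picture", 29981, 27969), ("smaller panorama", 162589, 120391)]
  for name, before, after in tests:
      saved = 100 * (before - after) / before
      print(f"{name}: {before} -> {after} bytes, about {saved:.0f}% smaller")
  # first picture: about 7% smaller; smaller panorama: about 26% smaller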
12-03-2017: I do essentially all my work from a terminal. This is why crashes of the terminal app are especially annoying. I have used xterm for about 30 years and it has always been stable, on all platforms, even on thin clients or over slow modems. OS X Sierra is the first exception. It is a known issue. I find it happening more frequently when editing files with long lines. Fortunately, unix apps like vim have built-in recovery so that one does not lose data when writing a program. It is still terribly annoying, and the stability of fundamental apps like the terminal should have first priority. The issue has been known to Apple since last fall.
26-02-2017: When trying to upload this clip, it took only seconds to get banned from youtube. While this spoof was accepted (with ads), the Rammstein clip is seriously protected. One definitely has to accept that; even so, I believe fair use still applies: no monetary part, no damage for the producer. It is maybe not sufficiently small.
18-02-2017: An alarming trend: IT job reductions in the US, of course due to the increased centralization. Universities also follow the trend and outsource more and more of their IT. It is sad, as it used to be that the IT developed and maintained at universities was on the cutting edge. It was encouraged to tinker and experiment with technology. Now things go to third parties, companies which can do things cheaper in a centralized manner, possibly abroad or in data centers where labor is less expensive. I personally believe this will come back with a vengeance. First of all, the IT reduction trend is demoralizing for young students interested in tech. Actually, this demoralizing effect could hit us very hard in a few dozen years and in many ways. But managers tend to think short term, even at universities. Here are a few reasons why thinking short term is dangerous: 1. It becomes already today harder and harder to convince a young person to pursue a STEM career. The myopia of leaders not understanding the fear of becoming obsolete and powerless has even led to Trumpism, a phenomenon not even dreamed of one year ago. 2. Centralized data centers (let's call them the "revenge of mainframe computing and thin clients") pave the way to a risky future. A meltdown of a major player already now would risk the operation of industries, universities or even the economy. 3. We are still in a "buy-in phase", where vendors dump prices to destroy competition and local IT structures. Once these are gone, everybody will be hooked and will have to pay whatever prices are prescribed by the soon-to-be monopoly. We see that already in internet access, where prices are unreasonably high due to the lack of choice. 4. History shows how fragile a political landscape can be (and how important technology can be for manipulation). I think that the current IT structures being built (a few big players controlling information) have already made it much easier for a totalitarian state to become a possibility (even in the US). There are some floodgates and safeguards in place, one of them (on the technology side) being strong cryptology, another the availability of open source operating systems, but there could come a time when it will be difficult for a new start-up to enter the field and compete, as the important information structures are controlled by a few players who own the patents, the pipes and the power. But there is not only doom: we also live in a great time of technology. Our operating systems on desktops (like Linux) have become rock solid. Even a personal data center with a dozen terabytes of data has become cheap. And being able to carry around an entire library of books in the phone would have been unthinkable just 15 years ago. P.S. I just remembered that I made my first steps in computing on a mainframe using a thin client. This was convenient: I did not have to carry around floppies and back up things myself, but it was also risky, and indeed, for some reason, I have lost most of the work done on that mainframe. Only some printouts survived. I would give a lot to get back what I wrote at that time about finitely presented groups, to recover the projects done as a course assistant (like a cool project in which the students had to write an AI program solving the Rubik's cube. This means not just implementing a solution of the cube but finding a solution path using the Schreier algorithm!). To be fair, I also lost the source code to many Pascal programs I wrote as a student on the Macintosh or Atari.
There it was just negligence in backing things up properly, or misplaced backup floppies. But I still have tens of thousands of pages of not-yet-digitized diary books with mathematics and programs rotting in the basement. Maybe I will one day dig through that and scan a few things in.
28-01-2017: It is still a complete disaster with USB-C hubs. There is none which allows charging and using other USB-C devices at the same time. There is one product around the corner, but pre-ordering is not so much my thing. I need to have something right now and guaranteed. It seems however to be a fact that there is nothing available which allows the Macbook to access an external USB-C drive while charging. Currently, if I use my USB-C drive for a backup, the battery will be drained until everything is backed up. I therefore turned to a network-attached storage solution for my laptop backups (besides, of course, the local syncs from laptop to desktop which are done with "rsync").
23-01-2017: A great ode to VIM, especially its attitude towards innovation and features, and towards the urge of developers for UI rewrites or features changing the workflow or, worse, compatibility. I myself write everything in vim, from simple notes, LaTeX documents, html documents, to programs (even if programming languages offer their own file formats) etc. VIM is now 25 years old. It has improved quite a bit. A decade ago, I started to warm up to syntax coloring.
10-01-2017: Some pictures showing the strange green spot, obtained in Panoramas. Looks spooky, but has an explanation. It is just strange that in that case, the flares always appeared to come from the same spot in the meadow...
06-01-2017: Just upgraded the phone. A first test of the camera of the iPhone 7 in the Boston Library. Compare with the pictures in Blockisland, which were done with the iPhone 6 and still had 40 megapixels. The new panoramas have 60 megapixels.

Oliver Knill, Department of Mathematics, Harvard University, One Oxford Street, Cambridge, MA 02138, USA. SciCenter 432 Tel: (617) 495 5549, Email: knill@math.harvard.edu Twitter, Youtube, Vimeo, Linkedin, Scholar Harvard, Academia, Google plus, Google Scholar, Ello, Webcam, Fall 2017 office hours: Mon-Fri 4-5 PM and by appointment.