Cutting the Azure Web Services Bill by 90%. HOWTO!

An organization for which I am a contractor had an Azure bill that totalled more than $100,000 per year. After re-provisioning, we reduced the annual bill to less than $13,000. Sound good? Here are some tips.

1. Avoid the Azure-provided database servers, especially Cosmos DB.

The charges for SQL (and NoSQL) services on Azure are fantastically high. Cosmos DB charges by the number of collections you use, not the amount of storage. We had one project for which the Cosmos DB bill was $22,000 per year! It had 11 collections in it, each with fewer than 5 documents. I almost fainted.

What to do instead? Create a Linux virtual machine and run all of your file- and SQL-oriented services from that machine.

It is quite easy to install and run MongoDB, PostgreSQL, and MySQL. The required Linux administration skills are, well, meager. If your administrator has installed Linux on, say, 5 different workstations during the last few years, and fiddled with configurations and user requests for packages, I expect your administrator can handle this.
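To give a flavor of how small the job is, the installs are one-liners on Ubuntu (the package names below are the standard Ubuntu ones; adjust for your distribution, and note that MongoDB's official server packages come from mongodb.org rather than the distro repositories):

```shell
# Run on the database VM. Standard Ubuntu packages:
sudo apt-get update
sudo apt-get install -y postgresql mysql-server
# MongoDB: recent Ubuntu releases do not ship the server in the default
# repos, so follow the install guide at mongodb.org for official packages.
sudo systemctl enable --now postgresql mysql
```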

The only interesting part of this exercise came up in the administration of PostgreSQL. For whatever reason, the default settings are not optimized for speed on data set queries. After a couple of hours of panicked Googling and test adjustments in the config file, our PostgreSQL server instance was running faster--yes, faster--than the Azure-provided SQL server.
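To give a flavor of the adjustments involved (the values below are illustrative for a VM with about 8 GB of RAM, not the numbers from our server; tune for your own workload), the commonly adjusted knobs in postgresql.conf are:

```ini
# postgresql.conf: illustrative values for a VM with 8 GB RAM.
shared_buffers = 2GB           # default is only 128MB, far too small for caching
effective_cache_size = 6GB     # planner hint: how much the OS file cache can hold
work_mem = 64MB                # per-sort/per-hash memory; the 4MB default is tiny
maintenance_work_mem = 512MB   # speeds up VACUUM and index builds
```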

Another benefit of doing this is that your companion devices, which you keep in the same region and the same virtual net, can interact with your file/SQL server quickly. We set up an SSH tunnel between the systems, which really speeds things up and makes them more secure.
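As a sketch of the tunnel setup (the host alias, IP address, and user name below are invented for illustration), a LocalForward stanza in ~/.ssh/config on the app VM lets the application reach PostgreSQL through the encrypted tunnel:

```ini
# ~/.ssh/config on the app server; names and addresses are made up.
Host db-vm
    HostName 10.0.0.4                  # private IP of the database VM on the vnet
    User dbadmin
    LocalForward 5432 localhost:5432   # local port 5432 -> PostgreSQL on db-vm
```

Then `ssh -N db-vm &` keeps the tunnel open, and the app connects to localhost:5432 as if the database were local.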

2. Avoid using the Azure AppServices framework.

The app server prices are scaled to the demands on your app. An app that has any serious activity can cost you hundreds of dollars per month.

It is a better idea to buy a VM and run your own app server.

Most of the apps I needed to administer were Node.js or Python Flask apps. As developers know, these apps are typically developed with a small test web server that starts with a warning like "this is not a production-quality web server. Use a secure WSGI server."

I had never done this before January 2020, and it was not too easy. I spent about a week googling for alternative methods before I eventually settled on a "middleware" server called Phusion Passenger. Passenger is offered as free software, but there is a commercial version with a few extra features. One feature that attracted me is that Passenger is multi-lingual: it can host Python, Node.js, Ruby/Rails, and others. And the instructions are truly excellent.

The best web server at the current time is Nginx (say "engine X"). Nginx acts as a "reverse proxy" that receives web visitors and routes their requests to the various Passenger-hosted applications.
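A minimal Nginx server block for a Passenger-hosted app looks roughly like this (the server name and paths are placeholders, and the passenger_* directives require an Nginx built with the Passenger module, per the Passenger documentation):

```nginx
server {
    listen 80;
    server_name app.example.com;    # placeholder name
    root /var/www/myapp/public;     # Passenger apps serve from a public/ directory
    passenger_enabled on;
    passenger_app_type wsgi;        # for a Python/Flask app; use 'node' for Node.js
}
```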

3. Change billing from pay-as-you-go to reservations.

This is a little tricky; Microsoft does not make it very easy to figure out. If you are willing to commit to using a resource of a given size (a VM with a given number of CPUs and amount of memory) for 3 years, the price per month will drop. The price drop will be at least 30%, and maybe more.

There's a penalty for early withdrawal, as they say, but it is not severe. If you stay with the reservation for one calendar year, you roughly break even if you then cancel the service.

Keep in mind that the reservation is not for a particular machine; it is for a particular class of machine. So if you create a reservation for one project's machine, and that project is killed off, you can remove that VM and start a new one of the same type, and it inherits the lower reservation price.

4. Buy a rack server for GPU calculations

The Azure price for machines capable of GPU calculations is extraordinarily high. We found a much better option. There's a local company called Stallard Technology (STI) that sells Dell rack servers. Some are brand new, some are factory reconditioned.

We bought a rack server that had 2 Nvidia GPU devices in it for about $4300. It runs fine. I set up Ubuntu with TensorFlow on that system, and it generates results more quickly than the Azure GPU system did. Within a few months, we will have saved enough on the Azure GPU bill to pay for the server.

The conclusion: Azure VMs are handy devices that you can afford, if you are willing to run your own services. A lot of money can be saved.

Posted in Linux | Leave a comment

Building R-4.0.0 in a side folder

R 4.0.0 was released over the weekend, and immediately I got an email saying that one of my packages no longer builds with R-4.0.0. The email from R core said I have 2 weeks to bring the package into compliance. The Ubuntu Linux repositories do not have R 4.0 yet, so I built my own. This is something I've done many times over the years, but I expect many Linux newcomers have not. So I'll copy/paste some notes in case they help.

Amid all of these details, there are two especially important things to know. First, build R in a separate folder, not in the original source code folder. Second, specify the prefix so that the newly built version of R does not interfere with the previous one that still exists in the /usr file hierarchy.

First, download the file R-4.0.0.tar.gz from CRAN. Put it in an out-of-the-way folder and decompress it (I ran tar xzvf R-4.0.0.tar.gz). That creates a folder R-4.0.0.

cd into that directory and make a new subdirectory called build (mkdir build). We will compile R inside build to prevent the build process from altering the R source code itself. Then cd into build.

The usual way to build GNU software packaged in this traditional style is a 3-step process:

  1. configure
  2. make
  3. make install

The first thing we do, usually, is to inspect the configure options, with a command like

../configure --help

I want to install my new R-4.0.0 in a separate part of the file system, one where I can read/write as an ordinary user. So the VERY important option is --prefix=/tmp/R. For most software, it would be sufficient to run this in the build directory:

../configure --prefix=/tmp/R

That configures the software so that, when finished, it is installed in /tmp/R. However, as luck would have it, I tried that and it failed, because the R build tries to write files in /usr/share even after I set the prefix. That was a little unusual.

On Ubuntu, the R packages are built by experts who know what to do. I looked at the file /etc/R/Makeconf provided with R-3.6.3 to find out which options they used to build R. Their configure statement is quite massive, and I played with it for a while. In the end, I decided to take their version,

# R was configured using the following call
# (not including env. vars and site configuration)
# configure  '--prefix=/usr' '--with-cairo' '--with-jpeglib' '--with-readline' '--with-tcltk' '--with-system-bzlib' '--with-system-pcre' '--with-system-zlib' '--mandir=/usr/share/man' '--infodir=/usr/share/info' '--datadir=/usr/share/R/share' '--includedir=/usr/share/R/include' '--with-blas' '--with-lapack' '--enable-R-profiling' '--enable-R-shlib' '--enable-memory-profiling' '--without-recommended-packages' '--build' 'x86_64-linux-gnu' 'build_alias=x86_64-linux-gnu' 'R_PRINTCMD=/usr/bin/lpr' 'R_PAPERSIZE=letter' 'TAR=/bin/tar' 'R_BROWSER=xdg-open' 'LIBnn=lib' 'JAVA_HOME=/usr/lib/jvm/default-java' 'R_SHELL=/bin/bash' 'CC=gcc -std=gnu99' 'CFLAGS=-g -O2 -fdebug-prefix-map=/build/r-base-5i6V25/r-base-3.6.3=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -g' 'LDFLAGS=-Wl,-Bsymbolic-functions -Wl,-z,relro' 'CPPFLAGS=' 'FC=gfortran' 'FCFLAGS=-g -O2 -fdebug-prefix-map=/build/r-base-5i6V25/r-base-3.6.3=. -fstack-protector-strong' 'CXX=g++' 'CXXFLAGS=-g -O2 -fdebug-prefix-map=/build/r-base-5i6V25/r-base-3.6.3=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -g'

and I trimmed it down to something I could understand. I don't want to use build options that I don't understand. I will trust R's configure script to guess defaults correctly.

../configure '--prefix=/tmp/R' '--with-cairo' '--with-jpeglib' '--with-readline' '--with-tcltk' '--with-system-bzlib' '--with-system-pcre' '--with-system-zlib' '--mandir=/tmp/share/man' '--infodir=/tmp/share/info' '--datadir=/tmp/share/R/share' '--includedir=/tmp/share/R/include' '--with-blas' '--with-lapack' '--enable-R-profiling' '--enable-R-shlib' '--enable-memory-profiling' '--with-recommended-packages'

When that finished, there was a message indicating that I had the required software and that recommended packages would be installed.

Compiling R (with make) is time consuming (perhaps 8-10 minutes on one core), and it is faster if you ask make to parallelize the build. I allowed 6 cores for that.

make -j6

When that was finished, the last step is to run make install. However, because I diverted the R info and data files to /tmp/share, I needed to prepare for that by creating the folder:

mkdir -p /tmp/share

Then the all-important installation:

make install

Now, when I want to use R-4.0.0, I need to adjust the system path interactively. In the terminal, I run this to put the new R at the FRONT of the path.

export PATH=/tmp/R/bin:$PATH

To use the system-wide R-3.6, I need to open a different terminal, where /usr/bin/ will be at the front of the path.

Behold:

$ R

R version 4.0.0 (2020-04-24) -- "Arbor Day"
Copyright (C) 2020 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.
> sessionInfo()
R version 4.0.0 (2020-04-24)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Ubuntu 19.10

Matrix products: default
BLAS:   /usr/lib/x86_64-linux-gnu/atlas/libblas.so.3.10.3
LAPACK: /usr/lib/x86_64-linux-gnu/atlas/liblapack.so.3.10.3

locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=en_US.UTF-8
 [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8
 [7] LC_PAPER=en_US.UTF-8       LC_NAME=C
 [9] LC_ADDRESS=C               LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

loaded via a namespace (and not attached):
[1] compiler_4.0.0

As soon as the Ubuntu repositories have R-4.0, I'll switch to that version. But the lesson here may still benefit you in the long run. When a package is submitted to CRAN, one must sign an oath that the package installs cleanly on the CURRENT version of R as well as the DEVELOPMENT version of R. Now that you know how to build R from a tar archive, it is not so bothersome to install R-devel and test your package.


Creating a new SSH key with ed25519 encryption

A while ago, I prepared notes about creating an SSH key for use with Gitlab (and other SSH-based servers). That guide, https://pj.freefaculty.org/guides/crmda_guides/34.gitlab/34.gitlab.pdf explains the basic ideas of SSH keys.

Today I learned that when interacting with an SSH server, the USER can control, to a significant extent, the type of security that is used. This is because the USER creates the SSH key used for key-based authentication and keys can be encrypted in several different formats.

My "old" key (from last year) was created with RSA encryption. RSA was an upgrade over DSA. So when I log into a remote system that supports several protocols, the system notices my key is RSA.

You can see this for yourself in the shell if you run ssh -v your_server_name_here. The output shows what encryption your session is using, along with info about what is possible. I see

$ ssh -v kauffy
debug1: kex_input_ext_info: server-sig-algs=<ssh-ed25519,ssh-rsa,rsa-sha2-256,rsa-sha2-512,ssh-dss,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521>

debug1: kex: host key algorithm: ecdsa-sha2-nistp256
...
debug1: Will attempt key: /home/pauljohn/.ssh/PJ_kauffeykey_20190624 RSA SHA256:684fc0VEO/glopItyGook explicit agent

The server supports both ed25519 and ecdsa, and the former is preferred. When people want to create keys that will interact with my systems, I will ask them to create ed25519 keys as well.

First, I make sure my system's ssh-keygen is able to do this. Check the help page:

$ ssh-keygen --help
SYNOPSIS
 ssh-keygen [-q] [-b bits] [-t dsa | ecdsa | ed25519 | rsa] [-N new_passphrase]
                [-C comment] [-f output_keyfile]

This version of ssh-keygen can work with dsa, ecdsa, ed25519, and rsa. In the old days we had dsa; now dsa is considered insecure and should be removed from the list. The default key type is rsa, which is still considered adequate, but not as good as the others. Now we want ed25519, or, if the server does not support that, ecdsa. So I might as well create a new key of each type.

I always specify the key file name, so I can tell which key goes with which website. Today I did this (I cd into the ~/.ssh folder so the key file ends up where I want it):

$ cd .ssh
$ ssh-keygen -a 100 -t ed25519 -f "PJ-ed25519-20200415" -C "PJ-ed25519-20200415"
Generating public/private ed25519 key pair.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/pauljohn/.ssh/PJ-ed25519-20200415.
Your public key has been saved in /home/pauljohn/.ssh/PJ-ed25519-20200415.pub.
The key fingerprint is:
SHA256:dzhTwEx1KDcx87hNFqKdP8ZX4JlpsnneVGalQc1sSXM PJ-ed25519-20200415
The key's randomart image is:
+--[ED25519 256]--+
|         +o.*+B*E|
|          +++X XO|
|          .o*.@.=|
|           o % oo|
|        S = = B o|
|         . + + = |
|              . .|
|                 |
|                 |
+----[SHA256]-----+

The -C flag gives a comment that is saved at the end of the public key. Many people put their email there; I would rather put a reminder to myself of which key this is. I specify a file name with -f. Again, many people ignore the name, but it is important to me to know which key is which, so I put the key type and a date in the name. The -a parameter sets the number of key derivation function (KDF) rounds used to encrypt the private key; more rounds make passphrase verification slower, which discourages brute-force password cracking efforts.

I have one project where the server does not yet allow ed25519 keys, so I also need an ecdsa key. For this one, I specify the bits parameter at the maximum value:

$ cd ~/.ssh
$ ssh-keygen -b 521  -t ecdsa -C "pauljohn_ecdsa_20200415" -f "PJ-ecdsa-20200415"
Generating public/private ecdsa key pair.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in PJ-ecdsa-20200415.
Your public key has been saved in PJ-ecdsa-20200415.pub.
The key fingerprint is:
SHA256:ZakN2sprQk6hUwk5IGVTKQI2pP10OKMReY03QYgHn9E pauljohn_ecdsa_20200415
The key's randomart image is:
+---[ECDSA 521]---+
|****oBo.         |
|=oO+*+E    .     |
|..o*Bo... +      |
|   =++ o *       |
|  .o... S .      |
|  o o. .         |
|   =  o          |
|    o ..         |
|     o.          |
+----[SHA256]-----+

After that, I've got 2 new keys, each with public and private parts:

$ ls -la
[snip]

-rw-------   1 pauljohn pauljohn   801 Apr 15 14:17 PJ-ecdsa-20200415
-rw-r--r--   1 pauljohn pauljohn   277 Apr 15 14:17 PJ-ecdsa-20200415.pub
-rw-------   1 pauljohn pauljohn   464 Apr 15 14:16 PJ-ed25519-20200415
-rw-r--r--   1 pauljohn pauljohn   101 Apr 15 14:16 PJ-ed25519-20200415.pub

There is a problem that you will run into if you have several different SSH keys. When you try to log in to a server, your ssh client offers your keys one at a time, and the server is configured to allow only a few authentication attempts before giving up. The way to solve that is to tell the client which key should be used when you log in to a given host.

There are two ways to do this. First, on the command line, add a parameter to tell ssh which key to use. For example, ssh -i ~/.ssh/PJ-ed25519-20200415 kauffy would work. If the path to the key has any spaces or special characters (unwise!), you'd need quotes around the key's name. But don't be a silly person and create a key with spaces or symbols like * or & in the name. That's just wrong.

Second, the method I actually use is to create a stanza in my SSH configuration, a file called ~/.ssh/config. These configurations are just cut-and-paste things, nothing fancy. Here's the top of the file

Host subversions.gnu.org
     Protocol 2

Host kauffy
     HostName 42.73.187.92
     User pauljohn
     IdentityFile ~/.ssh/PJ-ed25519-20200415
     TCPKeepAlive yes
     ServerAliveInterval 10
     IdentitiesOnly yes

On the server, I need to insert the content of my file ~/.ssh/PJ-ed25519-20200415.pub into the file called ~/.ssh/authorized_keys. This can be done in various ways. Because I'm old, I use scp to transfer the key to the server, then simply add the key to the file with cat. On the server, run:

$ cd .ssh
$ cat ~/PJ-ed25519-20200415.pub >> authorized_keys

Other people use wrapper scripts (ssh-copy-id, for example) that transfer the public key file to the server and append it to authorized_keys. Perhaps your OS has a shortcut like that. But I don't see any reason to rely on somebody else's scripting magic when this is a perfectly understandable and doable thing for an average user. There's no magic in it.

After this, whenever I want to connect to the server over ssh, I run in the terminal

$ ssh kauffy

After all this, I'd better make sure the connection actually uses the ed25519 key. So I go to my client computer and run

$ ssh -v kauffy

and in the output, I find comments indicating that it did work:

debug1: identity file /home/pauljohn/.ssh/PJ-ed25519-20200415 type 3
debug1: identity file /home/pauljohn/.ssh/PJ-ed25519-20200415-cert type -1
...
debug1: Offering public key: /home/pauljohn/.ssh/PJ-ed25519-20200415 ED25519 SHA256:gobbeldyGook explicit agent
debug1: Server accepts key: /home/pauljohn/.ssh/PJ-ed25519-20200415 ED25519 SHA256:gobbeldyGook explicit agent
debug1: Authentication succeeded (publickey).

Special thanks to Wes Mason at KU's ITTC for the suggestion to try ed25519 and, failing that, ecdsa.


R package semTable-1.7 uploaded on CRAN

semTable-1.7 includes a bug fix for tables that include variances estimated between latent variables and exogenous predictors. It has been proposed to CRAN.

In the meanwhile, to avail yourselves, try this to put KRAN (our test server) at the front of your repo search path:

CRAN <- "http://rweb.crmda.ku.edu/cran"
KRAN <- "http://rweb.crmda.ku.edu/kran"
options(repos = c(KRAN, CRAN))
install.packages("semTable", dep = TRUE, type = "source")

The parameter 'type="source"' is required to download the newest version from KRAN, but when the new version percolates into CRAN, it will not be necessary. The CRAN auto-processor has accepted the proposed package, and usually it takes 2 days for versions to become available for Linux, Mac, and Windoze systems.

In case you have not tried semTable yet, it makes presentable tables from structural equation models estimated in R with lavaan. My initial focus was LaTeX tables, but it now generates HTML and CSV files as well. There is a vignette distributed with the package; I uploaded a copy here for your review.

http://pj.freefaculty.org/scraps/semtable.pdf


Does mathjax-latex work? Not entirely if Markdown is enabled.

I have updated to PHP 7.2 FastCGI and WordPress 5.2. I want to use WordPress to display markdown documents that include some LaTeX, so I've installed a plugin for markdown, which seems to work. However, I'm getting mixed results with "mathjax-latex". Perhaps markdown and mathjax-latex were not meant to cooperate; I will need to do more testing.

The main issue is that LaTeX markup that works in a markdown document exported to HTML with, say, pandoc, does not work "as is" with mathjax-latex. In particular, special markup seems to be required to get in-line math.

I can easily show the result of failed efforts, but have difficulty showing the markup. The ordinary markdown code tricks, like indenting four spaces or enclosing in ```, seem not to protect code from latex interpretation.

For inline math, the single dollar sign does not work. Observe, you see dollars $x_i$. However, writing the words "latex" within hard brackets "[" does work, along with "["/latex"]". The symbol "["latex"]" is converted to slash-paren, "\ (". Also I can explicitly type a double escaped "\\ (". Here is a gamma inserted inline with the first method: \(\gamma\), and using the second method \(\gamma\). Here is another Einstein favorite with the double-backslashed parenthesis: \(E=mc^2\), but the single dollar sign is a dud $E=mc^2$. However, if I literally write '$latex' at the beginning, it works: \(E=mc^2\).

Here is an example where the dollar sign notation does not work: This x sub i is at the end of a line: $x_i$. That is a big disappointment because all of my example documents use that notation. Just now I wonder about using escapes, '\$x_i\$'. Fail!

But if I put double dollar sign xsub i on line by itself, we get a displayed equation,

$$x_i$$

Double dollar signs give a display equation if the dollar signs are on lines by themselves, which is happy news.

$$
y_{i}\label{1b}
$$

Bad news: the preferred LaTeX markup "\[" does not work to give a display equation:

[
E = mc^2
]

To make that work, apparently I need a double back slash. That's unfortunate.

Here is a double back slash hard bracket, to show it can work:

\[E=mc^2\]

I use double backslashed "\begin{equation}" to try for a labeled/numbered display:

\begin{equation}
\gamma\label{eq:gam}
\end{equation}

No number, and the label doesn't work; trying to cross-reference with equation (\ref{eq:gam}) fails.

So, LaTeX that begins with a backslash apparently needs a double backslash, but that does not work for all \LaTeX, as you see from the fact that LaTeX is not displayed with any special markup.


Finding and Recovering lost files in Git

One problem in git is that when a user deletes a file accidentally, git acts as though the user intended to do that. The next time the user runs "git commit -a", the list of transactions will include the formal deletion of the file. People who don't inspect the file list, or who carelessly run "git commit -a -m 'your message'", will commit the deletion without noticing.

Today I had that situation arise and I have a brief report on how I fixed it. If you don't know for sure what the file path was, first scan all deleted files to see if you spot the ones you want:

$ git log --diff-filter=D --summary | grep delete

If the list is huge, grep for an element in the file name or path you are looking for. In my case, the missing file was an ogv file:

$ git log --diff-filter=D --summary | grep delete | grep ogv

delete mode 100644 36.LaTeX_Overview/LaTeX-Overview.ogv
delete mode 100644 37.LyX-for_LaTeX_homework/LyX-LaTeX_homework.ogv
delete mode 100644 46.windows_R_setup/46.windows_R_setup-slides.ogv

I had 3 files to restore. Now I know their full paths.

Find out which commit removed the files (I've sanitized the name of the author of the commit):

$ git log --all -- 37.LyX-for_LaTeX_homework/LyX-LaTeX_homework.ogv
commit b67eaa335379d5bb5e1187482d3647000426644d
Author: Anonymous
Date: Thu Sep 27 15:43:43 2018 -0500

Finished the installing dependencies and R sections.

There are 2 ways to recover files. These files happened to be in an LFS-enabled git repo, so the first method I tried retrieved only the LFS "reference" (pointer) version of the file, and I was not sure I could trust that. This method retrieved the actual ogv file:

$ git checkout b67eaa33^ -- 37.LyX-for_LaTeX_homework/LyX-LaTeX_homework.ogv

The ^ means "the one before this", because the commit before this one was the last one that still had the file.

The other method of recovery uses git show. In my git instruction manual (crmda.ku.edu/guides), we have an illustration of the show method.
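For completeness, here is a self-contained sketch of the git show approach in a throwaway repository (the file name and commit messages are invented for the demo; in real life you would use the deleted file's path and the commit id found with git log):

```shell
# Demo: recover a deleted file with `git show` in a scratch repository.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo && cd demo
echo "precious data" > notes.txt
git add notes.txt
git -c user.email=demo@example.com -c user.name=demo commit -q -m "add notes"
git rm -q notes.txt
git -c user.email=demo@example.com -c user.name=demo commit -q -m "oops, deleted notes"
# HEAD^ is the last commit that still had the file; write its copy back out:
git show HEAD^:notes.txt > notes.txt
cat notes.txt
```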


Building a new xfwm4 Debian package: Try it, you’ll like it!

In the askubuntu web forum (http://askubuntu.com/questions/821940/configure-mouse-buttons-in-xfce4), I asked if it is possible to change the behavior of the mouse in response to Alt-Middle click. Currently, that causes a window to be moved to the bottom of the stack of windows. On a new Dell Precision 5510, the trackpad makes it difficult to tell where the left button ends and the middle button begins, and I often accidentally get the middle button when I want the left.

There were no answers for a while, so I asked the same question in the XFCE4 support forum: https://forum.xfce.org/viewtopic.php?pid=43174#p43174

The answer is that the Alt-Middle click behavior (push the window to the bottom of the stack) is hard-coded in the xfwm4 source code. The only way to change it is to recompile xfwm4. A helpful person pointed at the source file events.c, line 928, for revision. I found it easy to build a new package, and now xfwm4 behaves the way I want!

In case you have not tried this for yourself, the process is much more straightforward than one might expect. It seems quite a bit better than it was just 18 months ago. In brief: get the source for the package, fiddle with the source code, try to rebuild, let the tools make a patch, then build again.

$ sudo apt-get build-dep xfwm4
$ mkdir -p tmp/xfmw4
$ cd tmp/xfwm4
$ apt-get source xfwm4 
$ cd xfwm4-4.12.3/src
$ vi events.c

In that file I made a change that amounts to this patch file. It looks like the spaces were destroyed in cutting and pasting, but if you open events.c and go to line 928, you will see there is one line I've commented out and one I've added.

+++ xfwm4-4.12.3/src/events.c
@@ -925,7 +925,8 @@ handleButtonPress (DisplayInfo *display_
}
else if ((ev->button == Button2) && (state) && (state == screen_info->params->easy_click))
{
- clientLower (c, None);
+ /* clientLower (c, None); */
+ button1Action (c, ev);
}
else if ((ev->button == Button3) && (state) && (state == screen_info->params->easy_click))
{

This causes the Alt-Middle and Alt-Left behaviors to be the same. I fiddle those lines in the source, then run

$ dpkg-buildpackage -rfakeroot

In the new version of this program there is a very handy feature: the builder notices you edited the file, makes a patch for you, and puts the patch in the debian directory, under patches. Read the output; it will be obvious what to do.

$ dpkg-source commit

Before running the builder again, edit the changelog to update the version. The easiest way is to use the helper named "dch":

$ dch

This opens an editor where you need to make sure you increment the package version, so that when you install the new xfwm4 you build, it gets a new number.

Then run the builder again

$ dpkg-buildpackage -rfakeroot

If all goes well, then the new packages will be in the directory above.

$ cd ..
$ sudo dpkg -i xfwm4_4.12.3-1ubuntu4_amd64.deb
$ xfwm4 --replace &

You'll see the effect of the change right away.

If you've never rebuilt a Debian/Ubuntu package, you might as well download the source and give it a try. This is one of the most satisfying parts of being a Linux user.

This change in xfwm4 helps me quite a lot because this touchpad is, well, very difficult to use. Without looking down at the touchpad, I find it impossible to know for sure where the left button area ends and the middle button begins. My right thumb does not always reach far enough to find the left button. By making the left and middle button alt-click behaviors the same, I reduce the error rate quite a bit.

In caveman talk, I'd say to Dell: "trackpad bad! buttons good!". I know I'm out of style here.


Building R-devel on RedHat Linux 6


Warning: I'm 85% done with this; the formatting is not right. I DO NOT want to type the prompt in front of every command, because then one cannot copy/paste directly. However, copying some output chunks picks up the dollar signs, so I'm inconsistent.

Brief Summary: R-devel will not build without access to newer zlib and bzip2. This is a problem because CRAN requires users to test packages against R-devel, not against existing R or R-patched. This note has step-by-step information about what was necessary to compile & install R-devel in my user account on a cluster compute environment running RedHat 6 Linux. To upload an R package, one must agree to compile the package against R-devel, the cutting-edge version of R.

The current (2016) version of R-devel has removed the bundled versions of several compression libraries that used to be included. Instead of providing those libraries, R-devel assumes they are installed in the operating system. On my up-to-date Ubuntu laptop, this was not a concern, because I have up-to-date versions of zlib, xz, pcre, and curl. On the compute cluster, which is still running RedHat 6, it is a more serious problem, because the libraries zlib, bzip2, pcre, curl, and xz are out of date. We find that out when we try to build R-devel: it fails and tells us what is out of date.

As a result, one cannot configure and compile R-devel until one gets updated libraries. If the system administrators would replace all of those libraries, we could go ahead. However, on a Unix system it is possible to compile and install support libraries in a user's account, without system-wide intervention. With the exception of bzip2, where the installation is a non-standard setup, the installs of zlib, xz, curl, and pcre are standard and easy.

Here is the process I went through on RHEL6 to make this go. Special thanks to Wes Mason at KU ITTC who provided the critical ingredient.

1. Our cluster defaults to an ancient version of gcc. I can tell my environment to use the newer gcc compiler:

$ module avail
$ module load gcc/4.9.22

2. Try to build R-devel without making any special preparations.

mkdir src
cd src
wget --no-check-certificate https://stat.ethz.ch/R/daily/R-devel_2016-02-11.tar.gz
tar xzvf R-devel_2016-02-11.tar.gz
cd R-devel
./configure --help
mkdir builddir
cd builddir
../configure --prefix=$HOME/packages/R-devel '--with-cairo' \
  '--with-jpeglib' '--with-readline' '--with-tcltk' \
  '--with-blas' '--with-lapack' '--enable-R-profiling' \
  '--enable-R-shlib' \
  '--enable-memory-profiling'

## fails ignominiously:
checking if zlib version >= 1.2.5... no
checking whether zlib support suffices... configure: error: zlib
library and headers are required

3. Install zlib. Download, un-tar, configure, compile:

cd ~/src
wget http://zlib.net/zlib-1.2.8.tar.gz
tar xzvf zlib-1.2.8.tar.gz
cd zlib-1.2.8
./configure --prefix=$HOME/packages

I won't show all the output for all of these things, but this is brief and representative

 Checking for gcc...
 Checking for shared library support...
 Building shared library libz.so.1.2.8 with gcc.
 Checking for off64_t... Yes.
 Checking for fseeko... Yes.
 Checking for strerror... Yes.
 Checking for unistd.h... Yes.
 Checking for stdarg.h... Yes.
 Checking whether to use vs[n]printf() or s[n]printf()... using vs[n]printf().
 Checking for vsnprintf() in stdio.h... Yes.
 Checking for return value of vsnprintf()... Yes.
 Checking for attribute(visibility) support... Yes.

This is the common GNU-style software: configure, make, make install. Run the remaining two steps; make install copies the files into place:

make
make install
$ make install
 cp libz.a /home/pauljohn/packages/lib
 chmod 644 /home/pauljohn/packages/lib/libz.a
 cp libz.so.1.2.8 /home/pauljohn/packages/lib
 chmod 755 /home/pauljohn/packages/lib/libz.so.1.2.8
 cp zlib.3 /home/pauljohn/packages/share/man/man3
 chmod 644 /home/pauljohn/packages/share/man/man3/zlib.3
 cp zlib.pc /home/pauljohn/packages/lib/pkgconfig
 chmod 644 /home/pauljohn/packages/lib/pkgconfig/zlib.pc
 cp zlib.h zconf.h /home/pauljohn/packages/include
 chmod 644 /home/pauljohn/packages/include/zlib.h /home/pauljohn/packages/include/zconf.h
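Every library below (xz, pcre, curl) follows this same fetch/configure/make/make install pattern against the $HOME/packages prefix. If you ever script the repetition, the tarball and source-directory names can be derived from the download URL; this parameter-expansion sketch is my own condensation, not part of the original recipe:

```shell
# Derive tarball and source-directory names from a download URL.
url=http://zlib.net/zlib-1.2.8.tar.gz
tarball="${url##*/}"         # strip everything through the last slash
srcdir="${tarball%.tar.gz}"  # strip the .tar.gz suffix
echo "$tarball $srcdir"      # zlib-1.2.8.tar.gz zlib-1.2.8
```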

4. Adjust the environment so R-devel builds will find packages installed there.

 export PATH=$HOME/packages/bin:$PATH
 export LD_LIBRARY_PATH=$HOME/packages/lib:$LD_LIBRARY_PATH 
 export CFLAGS="-I$HOME/packages/include" 
 export LDFLAGS="-L$HOME/packages/lib" 
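Since those four exports have to be present in every shell that configures or builds R-devel, it may help to keep them in a file you can source each time. This is my own convenience, not part of the original recipe; the file location is arbitrary:

```shell
# Write the overrides once; 'EOF' is quoted so $HOME expands at source
# time rather than at write time.
cat > /tmp/r-devel-env <<'EOF'
export PATH="$HOME/packages/bin:$PATH"
export LD_LIBRARY_PATH="$HOME/packages/lib:$LD_LIBRARY_PATH"
export CFLAGS="-I$HOME/packages/include"
export LDFLAGS="-L$HOME/packages/lib"
EOF
. /tmp/r-devel-env
echo "$CFLAGS $LDFLAGS"   # confirm the flags point at $HOME/packages
```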
The first two are vital during the "make" phase in R-devel; the latter two are vital in the "configure" phase.

5. Try to build R-devel again, using the new zlib. I remove and remake the build directory, so that any accumulated errors are eliminated:
cd ~/src
cd R-devel/
rm -rf builddir
mkdir builddir
cd builddir/
../configure --prefix=$HOME/packages/R-devel --with-cairo \
 --with-jpeglib --with-readline --with-tcltk \
 --with-blas --enable-BLAS-shlib --with-lapack --enable-R-profiling \
 '--enable-R-shlib' \
 '--enable-memory-profiling'

That gets past zlib, but configure ends with this error:

checking bzlib.h presence... yes
 checking for bzlib.h... yes
 checking if bzip2 version >= 1.0.6... no
checking whether bzip2 support suffices... configure: error: bzip2 library and headers are required

6. Get new bzlib support. This one is not built with GNU autotools, so it is a little more interesting/idiosyncratic. I would not have solved it without help from this site: http://www.linuxfromscratch.org/lfs/view/development/chapter06/bzip2.html

## So we go get bzlib, which is part of bzip2, just like we did on zlib

cd ~/src
wget http://www.bzip.org/1.0.6/bzip2-1.0.6.tar.gz
tar xzvf bzip2-1.0.6.tar.gz
cd bzip2-1.0.6

Inspect the README. The temptation is to be careless and just run make, but that's not quite enough, because we need the shared library. Build the shared library first, then run make:

make -f Makefile-libbz2_so
 make clean
 make
 make -n install PREFIX=$HOME/packages
 make install PREFIX=$HOME/packages

7. Try to build R-devel again

cd ~/src/R-devel
 rm -rf builddir/
 mkdir builddir
 cd builddir
 ../configure --prefix=$HOME/packages/R-devel '--with-cairo' \
 '--with-jpeglib' '--with-readline' '--with-tcltk' \
 '--with-blas' '--with-lapack' '--enable-R-profiling' \
 '--enable-R-shlib' \
 '--enable-memory-profiling'

configure fails with

checking whether bzip2 support suffices... no
 checking for lzma_version_number in -llzma... no
 configure: error: "liblzma library and headers are required"

8. Go get liblzma. I tried to compile liblzma on its own and failed; Wes Mason warned me not to install the separate liblzma, but rather to get the package known as xz.

cd ~/src
wget http://tukaani.org/xz/xz-5.2.2.tar.gz
tar xzvf xz-5.2.2.tar.gz
cd xz-5.2.2
./configure --prefix=$HOME/packages
make -j3
make install

8B. Try R-devel again, same steps as before: make builddir, then

../configure --prefix=$HOME/packages/R-devel '--with-cairo' \
 '--with-jpeglib' '--with-readline' '--with-tcltk' \
 '--with-blas' '--with-lapack' '--enable-R-profiling' \
 '--enable-R-shlib' \
 '--enable-memory-profiling'

That gets quite a bit further and fails:

checking for pcre/pcre.h... no
 checking if PCRE version >= 8.10, < 10.0 and has UTF-8 support... no
 checking whether PCRE support suffices... configure: error: pcre >= 8.10 library and headers are required

9. Get pcre. This is getting old by now.

cd ~/src
 wget ftp://ftp.csx.cam.ac.uk/pub/software/programming/pcre/pcre-8.38.tar.gz
 tar xzvf pcre-8.38.tar.gz
 cd pcre-8.38
 ./configure --prefix=$HOME/packages
 make -j3
 make install

9B. Back to R-devel. The R-devel configure fails the same way:

checking for pcre/pcre.h... no
checking if PCRE version >= 8.10, < 10.0 and has UTF-8 support... no
checking whether PCRE support suffices... configure: error: pcre >= 8.10 library and headers are required

9C. So I suspect the UTF-8 support is the issue. Back to PCRE to reconfigure. Usually it is best to erase the whole source directory and get a clean run at it. Because I did not do the safe thing and use a builddir in there, I have to do that. Then I run this configure command:

./configure --enable-utf8 --prefix=$HOME/packages
make 
make install

10. Try R-devel again. It fails asking for libcurl:

 checking libcurl version ... 7.19.7
 checking curl/curl.h usability... yes
 checking curl/curl.h presence... yes
 checking for curl/curl.h... yes
 checking if libcurl is version 7 and >= 7.28.0... no
 configure: error: libcurl >= 7.28.0 library and headers are 
required with support for https

11. Install libcurl. Note that we need to ignore a certificate problem on this one:

 cd ~/src
 wget --no-check-certificate https://curl.haxx.se/download/curl-7.47.1.tar.gz
 tar xzvf curl-7.47.1.tar.gz
 cd curl-7.47.1
 ./configure --prefix=$HOME/packages
 make -j3
 make install

12. Try R-devel again

cd ~/src
 cd R-devel
 rm -rf builddir
 mkdir builddir
 cd builddir
 ../configure --prefix=$HOME/packages/R-devel '--with-cairo' \
 '--with-jpeglib' '--with-readline' '--with-tcltk' \
 '--with-blas' '--with-lapack' '--enable-R-profiling' \
 '--enable-R-shlib' \
 '--enable-memory-profiling'

HOORAY, it finished!

config.status: creating tests/Embedding/Makefile
 config.status: creating tests/Examples/Makefile
 config.status: creating tools/Makefile
 config.status: creating src/include/config.h
 config.status: executing libtool commands
 config.status: executing stamp-h commands
R is now configured for x86_64-pc-linux-gnu
Source directory: ..
 Installation directory: /home/pauljohn/packages/R-devel
C compiler: gcc -std=gnu99 -I/home/pauljohn/packages/include
 Fortran 77 compiler: gfortran -g -O2
C++ compiler: g++ -g -O2
 C++11 compiler: g++ -std=c++11 -g -O2
 Fortran 90/95 compiler: gfortran -g -O2
 Obj-C compiler: gcc -g -O2 -fobjc-exceptions
Interfaces supported: X11, tcltk
 External libraries: readline, BLAS(generic), LAPACK(generic), curl
 Additional capabilities: PNG, JPEG, NLS, cairo, ICU
 Options enabled: shared R library, R profiling, memory profiling
Capabilities skipped: TIFF
 Options not enabled: shared BLAS
Recommended packages: yes
configure: WARNING: you cannot build info or HTML versions of the R manuals
configure: WARNING: neither inconsolata.sty nor zi4.sty found: PDF vignettes and package manuals will not be rendered optimally

That last warning, well, I'm ignoring it. I don't need to build their documents; I need to see if my package builds without errors. I don't care much that shared BLAS is not enabled, but I usually would want that if I were making a production system. However, running

make

ends in failure:

gcc -std=gnu99 -shared -fopenmp -L/home/pauljohn/packages/lib -o libR.so 
CommandLineArgs.o Rdynload.o Renviron.o RNG.o agrep.o apply.o 
arithmetic.o array.o attrib.o bind.o builtin.o character.o coerce.o 
colors.o complex.o connections.o context.o cum.o dcf.o datetime.o 
debug.o deparse.o devices.o dotcode.o dounzip.o dstruct.o 
duplicate.o edit.o engine.o envir.o errors.o eval.o format.o 
gevents.o gram.o gram-ex.o graphics.o grep.o identical.o 
inlined.o inspect.o internet.o iosupport.o lapack.o list.o 
localecharset.o logic.o main.o mapply.o match.o memory.o 
names.o objects.o options.o paste.o platform.o plot.o plot3d.o 
plotmath.o print.o printarray.o printvector.o printutils.o qsort.o 
radixsort.o random.o raw.o registration.o relop.o rlocale.o 
saveload.o scan.o seq.o serialize.o sort.o source.o split.o 
sprintf.o startup.o subassign.o subscript.o subset.o summary.o 
sysutils.o times.o unique.o util.o version.o g_alab_her.o 
g_cntrlify.o g_fontdb.o g_her_glyph.o xxxpr.o `ls ../unix/*.o 
../appl/*.o ../nmath/*.o` ../extra/tre/libtre.a -lblas -lgfortran 
-lm -lquadmath -lreadline -lpcre -llzma -lbz2 -lz -lrt -ldl -lm 
-licuuc -licui18n
 /usr/bin/ld: /home/pauljohn/packages/lib/libbz2.a(bzlib.o): 
relocation R_X86_64_32S against `BZ2_crc32Table' can not be used 
when making a shared object; recompile with -fPIC
 /home/pauljohn/packages/lib/libbz2.a: could not read symbols: Bad value
 collect2: error: ld returned 1 exit status
 make[3]: *** [libR.so] Error 1
 make[3]: Leaving directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir/src/main'
 make[2]: *** [R] Error 2
 make[2]: Leaving directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir/src/main'
 make[1]: *** [R] Error 1
 make[1]: Leaving directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir/src'
 make: *** [R] Error 1

13. That's certainly pointing the finger back at bzip2, which is the only non-standard library in the whole batch. It doesn't use GNU autoconf and has vague instructions. I went into the bzip2 directory and inserted -fPIC as a CFLAG in the Makefile. Then I ran make and make install PREFIX=$HOME/packages again, as above.

14. R-devel, again. rm the builddir, make a new builddir, go in there, run the configure statement. Looks OK. Then
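The -fPIC edit can be done with sed instead of a hand edit. The invocation below is my own, demonstrated on a scratch copy of the relevant Makefile line rather than on the real bzip2 tree:

```shell
# Prepend -fPIC to the CFLAGS line, as the hand edit in step 13 does.
printf 'CFLAGS=-Wall -Winline -O2 -g\n' > /tmp/Makefile.demo
sed -i 's/^CFLAGS=/CFLAGS=-fPIC /' /tmp/Makefile.demo
cat /tmp/Makefile.demo   # CFLAGS=-fPIC -Wall -Winline -O2 -g
```

Against the real tree, the same sed on ~/src/bzip2-1.0.6/Makefile, followed by make clean, make, and make install PREFIX=$HOME/packages, reproduces the fix.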

make

succeeds. Be aware, it is VITAL that PATH and LD_LIBRARY_PATH be set in the environment as stated above. Here's the evidence it did eventually compile:

make[2]: Leaving directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir/src/library/Recommended'
make[1]: Leaving directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir/src/library/Recommended'
make[1]: Entering directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir/src/library'
building/updating vignettes for package 'grid' ...
building/updating vignettes for package 'parallel' ...
building/updating vignettes for package 'utils' ...
make[1]: Leaving directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir/src/library'
make[1]: Entering directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir'
configuring Java ...
Java interpreter : /usr/bin/java
Java version : 1.7.0_09-icedtea
Java home path : /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre
Java compiler : /usr/bin/javac
Java headers gen.: /usr/bin/javah
Java archive tool: /usr/bin/jar
trying to compile and link a JNI program 
detected JNI cpp flags : -I$(JAVA_HOME)/../include -I$(JAVA_HOME)/../include/linux
detected JNI linker flags : -L$(JAVA_HOME)/lib/amd64/server -ljvm
make[2]: Entering directory `/library/tmp/Rjavareconf.wg86X6'
gcc -std=gnu99 -I/home/pauljohn/src/R-devel/builddir/include -I/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/../include -I/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/../include/linux -I/usr/local/include -fpic -I/home/pauljohn/packages/include -c conftest.c -o conftest.o
gcc -std=gnu99 -shared -L/home/pauljohn/src/R-devel/builddir/lib -L/home/pauljohn/packages/lib -o conftest.so conftest.o -L/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre/lib/amd64/server -ljvm -L/home/pauljohn/src/R-devel/builddir/lib -lR
make[2]: Leaving directory `/library/tmp/Rjavareconf.wg86X6'
JAVA_HOME : /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/jre
Java library path: $(JAVA_HOME)/lib/amd64/server
JNI cpp flags : -I$(JAVA_HOME)/../include -I$(JAVA_HOME)/../include/linux
JNI linker flags : -L$(JAVA_HOME)/lib/amd64/server -ljvm
Updating Java configuration in /home/pauljohn/src/R-devel/builddir
Done.
make[1]: Leaving directory `/panfs/pfs.acf.ku.edu/home/pauljohn/src/R-devel/builddir'

 


New lifescape panorama

 

Today, I was struck by the beauty of the desert on a crystal clear morning. As the panorama below  indicates, there is no apparent sign of life in this seemingly barren, desolate part of the high plains. Only the grizzled veterans like me can see the glimmers of life that lie deep within the sand, waiting for the rainy season to bring precious moisture.

The library on new year's eve


When the rainy season comes (or the semester starts), the isolated cedar and cactus trees will not look so lonely.


Time Series tips for R users

Two people were in here last week asking me about time series modeling in R. I said I'd look something up.

I thought this was fabulously helpful for translating between R and time series books:

http://www.stat.pitt.edu/stoffer/tsa2/Rissues.htm

The leading authority on time series, especially ARIMA, in R is probably Rob Hyndman:

http://robjhyndman.com/hyndsight/time-series-data-in-r/

A new tutorial announced today reminded me that I promised to look up time series materials:

http://www.analyticsvidhya.com/blog/2015/12/complete-tutorial-time-series-modeling/
