
Tag: latexrender

command line interface

Way back in 1999 I read Neal Stephenson’s pamphlet In the Beginning… Was the Command Line and
decided I should and would have Linux running on my clamshell iBook.
Needless to say this was (a) a foolish idea and (b) not entirely trivial
in those dark OS 9 days. Still, I somehow managed with the help of PPC Linux and was
proudly wearing their T-shirt (at least for a couple of weeks in early
2000). Fortunately, as a brief OS X
history
recalls, OS X was released March 24, 2001 and put an end to
my Linux-folly, and I’m pretty certain even Neal Stephenson is on Mac OS X
these days.

Needless to say I couldn’t resist installing the
Wordpress CLI-theme
the moment I spotted it! A command line
interface to your blog! Awesome! If you want to have a go at the
original version, take a look at Rod McFarland’s blog.
Just type ‘ls’ to the prompt and you’ll be hooked. Or you can have a
look at the command line interface of NeverEndingBooks by going to the
left sidebar and clicking CLI under the ‘Command Line Version’ header
(don’t be afraid you can always come back by clicking on the
GUI-interface over there). My design is black on a light-gray background
and is nowhere near as cool as the original theme, but it was the only
quick way around some limitations of the CLI-theme.

The
CLI-theme operates as a front-end via a small interpreter which draws
its information directly from the WordPress-database. As a result you
lose the effect of all post-processing by plugins such as Markdown and LatexRender, two of
the plugins I use most! I could still live with the idea that pure LaTeX
was served to a CLI-environment between tex-tags, but surely I didn’t
want to lose all my links! The quick (and extremely dirty) way around
it was to resubmit the relevant part of the HTML-source files of the
GUI-frontend posts to the WP-database. And to serve the same LaTeX-gifs
to the GUI and CLI interface I needed the background to be a rather light
gray (taking #BDBDBD gray would have been much nicer wrt. the cool
rasterized grayed-images, but then some of the more recent LaTeX-gifs
became partially unreadable). Oh, and in the process I had to update the
permalink structure, thereby wrecking almost all internal
reference-links (but I’ll sort them out soon, I promise).

So, a
lot of work for a rather meagre result. What do I like about the
CLI-interface (apart from old time nostalgia)? I really like the
searching facility. Just type ‘search yourword’ to the prompt and it
will give you all posts containing that word (much quicker than in the
GUI-interface) and if you remember at least one word from a post-title,
feeding it to the prompt will give you the entire post (or a list of
posts if the same word appears in different posts). Try out typing
‘Perelman’ to see what I mean. Besides, bots don’t seem to know what to
do with the CLI-interface so for the few days I had this theme as my
default theme I was alone on NeverEndingBooks most of the time (which
helped a lot, as I had to change that many posts). So, whenever I want to
have the site to myself I’ll just change the default theme from now
on.

Still, I did put back the old GUI as default because the
CLI-theme still has a few drawbacks, such as the impossibility of writing
a sizable comment (not that too many of you do this, but anyway) and
some other quirks. Still, Rod McFarland is working on a version 2 (and
even set up a google-group for
those who want to code along, and maybe I’ll join the effort) which
promises a great improvement and I’m rather confident that by version
3.14 it will be in a state that I’ll have the CLI-interface as my
default. Until then, I’ll keep up the two front-ends and allow you to
toggle as you like (your browser will remember your preference).

I realize most of you are youngsters and not of my cpu2
generation, and so have a hard time imagining how exciting a command line
prompt can be. Fortunately, Neal Stephenson has made the full text of “In
the Beginning… Was the Command Line” available as a
free download. Print it out and enjoy!


Latexrender and dvonn boards

In order
to blog a bit about Dvonn-strategy, I made myself a simple Dvonn
LaTeX-template which works very well on paper but which gets mutilated
by Latexrender; for example, the first situation looks
like

$~\xymatrix@=.3cm @!C @R=.7cm{ & & \Black{2} \connS & &
\bull{d}{5} \conn & & \bull{e}{5} \conn & & \bull{f}{5} \conn & &
\bull{g}{5} \conn & & \bull{h}{5} \conn & & \SWhite \connS & & \SWhite
\connS & & \SWhite \conneS & & \\ & \bull{b}{4} \conn & & \SBlack
\connS & & \Black{6} \connS & & \bull{e}{4} \conn& & \bull{f}{4} \conn &
& \bull{g}{4} \conn & & \bull{h}{4} \conn & & \SWhite \connS & &
\SWhite \connS & & \SWhite \conneS & \\ \SBlack \connbeginS & &
\SBlack \connS & & \BDvonn{7} \connS & & \bull{d}{3} \conn & & \SBlack
\connS & & \BDvonn{6} \connS & & \bull{g}{3} \conn & & \bull{h}{3}
\conn & & \Dvonn \connS & & \SWhite \connS & & \SWhite \connendS \\ &
\Black{5} \connbeginS & & \bull{b}{2} \conn & & \SBlack \connS & &
\bull{d}{2} \conn & & \bull{e}{2} \conn & & \bull{f}{2} \conn & &
\bull{g}{2} \conn & & \bull{h}{2} \conn & & \SWhite \connS & & \SWhite
\connendS & \\ & & \bull{a}{1} \con & & \bull{b}{1} \con & & \Black{5}
\conS & & \bull{d}{1} \con & & \bull{e}{1} \con & & \bull{f}{1} \con & &
\bull{g}{1} \con & & \bull{h}{1} \con & & \White{2} & &} $

The
reason behind this unwanted clipping is that Latexrender uses
**convert** to clip the relevant part out of a ps-page containing only the
TeXed formula on an otherwise empty page, and then converts
it into a GIF-file (or any other format you desire). The obvious way
round this is to enlarge my template by adding two additional rows and
columns and putting visible nonsense there (such as dots) to enlarge the
relevant part so that no clipping is done of essential info. But then
(1) the picture generated becomes even larger than that above and (2) I
don’t want you to see the extra nonsensical dots… The essential line
in the **class.latexrender.php** file is

$command = $this->_convert_path." -density ".$this->_formula_density.
    " -trim -transparent \"#FFFFFF\" ".$this->_tmp_filename.".ps ".
    $this->_tmp_filename.".".$this->_image_format;

So
I needed to delve into the [manual pages for the convert command](http://amath.colorado.edu/computing/software/man/convert.html)
of the ImageMagick-package. To my surprise, the *-trim* option (which I
had hoped to adjust somewhat by adding parameters) doesn’t exist! Still, I
got around my second problem using the *crop* option and around the
first by using the very useful *geometry* option. The latter is also
useful if you find that the size of the output of Latexrender is not
compatible with the size of your regular text. Of course you can amend
this somewhat by using the *extarticle* documentclass (as suggested) but
if you want to further adjust it, use for example

-geometry 86%

to size the output to exactly 86% (or whatever you need).
So, whenever I want to do some Dvonn-blogging from now on I’ll change my
class.latexrender.php file as follows

$command = $this->_convert_path." -crop 0x0-10% -crop 0x0+10% -density ".$this->_formula_density.
    " -geometry 80% -transparent \"#FFFFFF\" ".$this->_tmp_filename.".ps ".
    $this->_tmp_filename.".".$this->_image_format;

which
produces the output

$\xymatrix@=.3cm @R=.7cm{.& & & & & & & & & &
& & & \\ & & & \Black{2} \connS & & \bull{d}{5} \conn & & \bull{e}{5}
\conn & & \bull{f}{5} \conn & & \bull{g}{5} \conn & & \bull{h}{5} \conn
& & \SWhite \connS & & \SWhite \connS & & \SWhite \conneS & & & \\ & &
\bull{b}{4} \conn & & \SBlack \connS & & \Black{6} \connS & &
\bull{e}{4} \conn& & \bull{f}{4} \conn & & \bull{g}{4} \conn & &
\bull{h}{4} \conn & & \SWhite \connS & & \SWhite \connS & & \SWhite
\conneS & & \\ & \SBlack \connbeginS & & \SBlack \connS & &
\BDvonn{7} \connS & & \bull{d}{3} \conn & & \SBlack \connS & &
\BDvonn{6} \connS & & \bull{g}{3} \conn & & \bull{h}{3} \conn & &
\Dvonn \connS & & \SWhite \connS & & \SWhite \connendS & . \\ & &
\Black{5} \connbeginS & & \bull{b}{2} \conn & & \SBlack \connS & &
\bull{d}{2} \conn & & \bull{e}{2} \conn & & \bull{f}{2} \conn & &
\bull{g}{2} \conn & & \bull{h}{2} \conn & & \SWhite \connS & & \SWhite
\connendS & & \\ & & & \bull{a}{1} \con & & \bull{b}{1} \con & &
\Black{5} \conS & & \bull{d}{1} \con & & \bull{e}{1} \con & &
\bull{f}{1} \con & & \bull{g}{1} \con & & \bull{h}{1} \con & & \White{2}
& & & \\ . & & & & & & & & & & & & & } $

which (I hope) you will
find slightly better…
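
Rather than editing the file by hand each time, the two variants of the command could be wrapped in a little switch. Below is only a sketch of the idea (the helper name and the $dvonn flag are my own invention, not part of Latexrender), using the same member values as in the snippets above:

function build_convert_command($convert_path, $density, $tmp_file, $format, $dvonn = false) {
    // Dvonn mode: the crop + 80% geometry command used in this post,
    // otherwise the stock trim-based command.
    if ($dvonn) {
        $options = " -crop 0x0-10% -crop 0x0+10% -density ".$density.
            " -geometry 80% -transparent \"#FFFFFF\" ";
    } else {
        $options = " -density ".$density." -trim -transparent \"#FFFFFF\" ";
    }
    return $convert_path.$options.$tmp_file.".ps ".$tmp_file.".".$format;
}

// example (paths purely illustrative):
// echo build_convert_command("/usr/local/bin/convert", 120, "tmp/formula", "gif", true);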


latexrender plugin for wordpress under tiger

Promises and pie-crusts are made to be broken, a wiser man once
said. Still, promises have a much longer life-span and sometimes their
real content becomes redundant over time.

A year ago, I
promised
to document how I got the
LaTeXRender Plugin for WordPress
working under OS X. The procedure
consisted of some trial-and-error operations, installing non-standard
versions of software and hardcoding certain directories throughout
certain files…

Not something I was looking forward
to when I decided to upgrade this WordPress blog but,
surprisingly, things went pretty smoothly this time (Mac-technology
has improved a lot). So, please don’t worry too much about this
post
and follow the (late) instructions below.

First
things first : I will assume you have the ‘generic’ LaTeX
running under Tiger (10.4), that is, use the i-Installer to download BOTH
LaTeX and Imagemagick! Further, in order to get WordPress up and
running, have the standard
MySQL 4.0 package
installed for 10.3 (not version
4.1…) and don’t use the generic Mac-PHP version, but
instead download Marc
Liyanage’s PHP5 package
which has plenty of additional
packages installed (notably GDlib and MCRYPT, which come in handy if
you want to fight spam-comments using BotCheck).


Download wp-latexrender.zip
and follow the instructions given to the letter
(there is one undocumented extra directory you have to fill in at the
start of the latexrender-plugin.php file). There is
just one additional thing to do. Find in the
class.latexrender.php file the line starting
with

// convert dvi file to postscript using dvips

and include the following lines just before it:

// begin of workaround
// extending the PATH environmental variable
$oldpath = getenv("PATH");
$where_imagemagick_is = "/usr/local/bin";
if ($oldpath) { $where_imagemagick_is .= ":$oldpath"; }
putenv("PATH=$where_imagemagick_is");
// end of workaround
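
To check that Apache’s PHP really picks up ImageMagick after this change, a quick test along the following lines can be dropped into a temporary PHP page (my own sanity check, not part of the plugin; adjust the path if your convert lives elsewhere):

<?php
// temporary sanity check: does the webserver's PHP now find convert?
putenv("PATH=/usr/local/bin:".getenv("PATH"));
echo shell_exec("which convert"); // should print the full path to convert
?>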

Activate the plugin and it should work! Still, there are three things you may want to change. In
the latex.php file uncomment the indicated lines as
you will be using htmlArea to input your posts. In addition, if you
have the MarkDown-plugin enabled, it is best to append additional
lines such as

$latex_formula = str_replace("<em>","_",$latex_formula);
$latex_formula = str_replace("</em>","_",$latex_formula);

(the first arguments being the opening and closing em-tag respectively)
or underscores will be interpreted as em-tags.
If you run into additional similar problems, the procedure is to
comment-out the line

 
unlink($this->_tmp_dir."/".$this->_tmp_filename.".tex");
  

near the end of class.latexrender.php, look in the
tmp directory for the TeX-file, detect the problem and add similar
lines to the ones above to solve it. Another useful thing to do
is to add TeX-packages in the class.latexrender.php file. My own
version has the following predefined symbols and loaded
packages

function wrap_formula($latex_formula) {
    $string  = "\documentclass[".$this->_font_size."pt]{".$this->_latexclass."}\n";
    $string .= "\usepackage[latin1]{inputenc}\n";
    $string .= "\usepackage{amsmath}\n";
    $string .= "\usepackage{amsfonts}\n";
    $string .= "\usepackage{amssymb}\n";
    $string .= "\usepackage{xy}\n";
    $string .= "\xyoption{all}\n";
    $string .= "\\newcommand{\vtx}[1]{*+[o][F-]{\scriptscriptstyle #1}}\n";
    $string .= "\\newcommand{\Cc}{\Bbbk}\n";
    $string .= "\\newcommand{\C}{\mathbb{C}}\n";
    $string .= "\\newcommand{\Q}{\mathbb{Q}}\n";
    $string .= "\\newcommand{\Z}{\mathbb{Z}}\n";
    $string .= "\\newcommand{\N}{\mathbb{N}}\n";
    $string .= "\\newcommand{\mathbf}[1]{{\\text{\\em \usefont{OT1}{cmtt}{m}{n} #1}}}\n";
    $string .= "\pagestyle{empty}\n";
    $string .= "\begin{document}\n";
    $string .= "$".$latex_formula."$\n";
    $string .= "\end{document}\n";
    return $string;
}
  

which, among other things, allow all commenters to add
quiver-pictures using xymatrix and vtx to depict vertices. Oh yes, you
can allow comments to include LaTeX-code by uncommenting the
line

// add_filter('comment_text', 'addlatex');

in the latexrender-plugin.php
file (but before you do make sure you have spam under control, such as
with BotCheck mentioned above). That’s all for now. If you want
to use TeX in a comment, make sure to put the code between tags [ tex
] and [ /tex ] (omitting the extra spaces). If you want me to add
other LaTeX-packages, leave a comment.


markLaTeXdown

Clearly,
an extended version of Markdown
including LaTeX-commands would be useful for mathematicians and surely
I’m not the first to think about this. In fact, I found a somewhat
pompous text New adventures in hi-fi text
by someone claiming to have done precisely that (though
he doesn’t give many details nor post a version of his altered program).

Still, it is pretty clear how to convert a _Markdown+LaTeX_
textfile to plain LaTeX (at least for regex-lovers).
Modify the _Markdown.pl_ script so that the Markdown markup is
translated not to HTML-tags but to LaTeX-commands.

More
interesting material can be found in a thread on _Markdown and
Mathematics_ starting with this post. In it, they search for a good way to include
LaTeX-mathematical commands in a MarkDown text. In fact, this is part of
a more general quest for a good _escape character_ in Markdown to
create _Markdown plus something_ versions. They opt for
{{ and }} rather than the usual
$ signs.

I think the alternatives [
tex ]
and [ /tex ] are slightly better because
then you could feed the text to a functional WordPress installation with the
LaTeXRender
plugin installed and copy the relevant part from the HTML-source of
the resulting post to get an HTML-version of the mathematical text with
all LaTeX-code converted to pictures. Clearly, typing the suggested tags
is somewhat cumbersome so I would type them using the
{{ and }} proposal (one
{ is not enough because a lot of LaTeX code uses single
curly brackets) and then do a global replace to get the
LaTeXRender-tags.
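
Such a global replace is a one-liner. Here is a rough sketch in PHP (the file names are of course arbitrary), turning the {{ }} escapes into LaTeXRender-tags:

<?php
// replace {{ ... }} escapes by [tex] ... [/tex] tags
$text = file_get_contents("post.txt");
$text = preg_replace('/\{\{(.+?)\}\}/s', '[tex]$1[/tex]', $text);
file_put_contents("post-latexrender.txt", $text);
?>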

Even more interesting would be to have a
version of the html2txt.py script for LaTeX, that is,
converting a LaTeX-file to Markdown + LaTeXcode which would give an easy
way to convert your existing papers to HTML if you feed the LaTeXRender
plugin with all the required newcommands and packages.


tiger days 1

It
should be really day 2 but yesterday evening I was a bit overoptimistic
and tried to get MySQL, Ruby, Rails & Tracks installed and in the
process totally wrecked my Ruby-system (and probably a few things more).
Besides, I found out that the _Carbon Copy Cloner_ work-around
doesn’t really work (that is, one canNOT boot from the cloned copy)
etc. etc. In short, a lot of frustration. So today, I started all over
again (using the install notes below to guide me and so I could reduce
the total time to about 2 hrs). But, as this was the easy bit (still to
come : MySQL, PHP, WordPress+LatexRender, Ruby&Tracks etc.) and I
don\’t want to redo everything again when I do something horribly wrong
I changed my overall tactics. I\’ll keep identical copies on my iBook
and on my iMac and do the next batch of installs on just one machine and
check whether everything works before syncing it to the other. If
something gets messed up I resync to the state of the previous day. Just
one question left : what program to use for the backup/restore now that
CCC seems to be broken? Fortunately, there is still PsyncX which still
seems to work fine (at least today…). Below, for what it is worth,
yesterday’s log of events:

Okay, I checked that I can still
TeX papers and connect to the printer on the iMac (after Archive/Install
to Tiger). Most other things have broken down, such as my mind on tracks
and my MySQL-database, but I\’m quite hopeful I can rebuild them all.
So, time for a drastic _Erase/Install_ on my iBook.

12:04 : One final safety check. Connect the external
HD, select the _Carbon Copy Cloned_ partition as StartUp Disk and
do a Restart to verify that it can be cloned back should everything go
terribly wrong. Seems to work nicely, so change again from StartUp disk,
restart and disconnect the external HD.

12:16
: Printed the macdevcenter install
tips
and made a fresh pot of coffee. Took the unread part of the
newspaper with me, connected Jan\’s iPod, made it the new StartUp disk
and did another Restart.

12:24 : Selected
\’English\’ as the main language. Selected _DiskUtility_ from the
_Utilities_ menu (before you have to select a Disk destination).
Selected the HD, clicked _Erase_ and choose _Erase Free
Space_ first, then choose the SecurityOption to \’zero out data\’.
(Both steps require a lot of extra time but what is the point of doing
an Erase if you don\’t erase properly? Btw. the macdev-article does not
agree with me on this point.) Meanwhile, had some coffee and a
read…

13:23 : Did quit DiskUtility
which brought me back to the Installer. Selected the HD and clicked on
_Options_ to select Erase&Install and clicked Continue. Then
clicked on _Custom Install_ to choose which Packages to Install.
Did choose _all_ Printer Drivers but in _Language
Translations_ only selected : French, German and Dutch. Didn\’t
select X11! Clicked : _Install_ and had yet another cup of
coffee…

13:45 : Restarted! Got me into
the SetupAssistant. Didn\’t choose to transfer info from another Mac. It
selected our wireless network immediately, and asked me for my .Mac
account info. Did create my main account and finished at
13:53 Only had to stop iTunes from wanting to put
PodSoftware onto the connected iPod… Checked for SoftwareUpdate
but there was none. Am connected to internet but had to add my other
mail-account. Done and received email at 14:05 Found
our Printer but did gray out two-sided printing (have to remember later
how I did set this up…).

14:12 : Time
to add the _Xcode Tools_ : opened the folder on the iPod and
clicked on _XcodeTools.mpkg_. Followed the default installation.
Finished and disconnected the iPod at 14:24. Took a break
to decide how to continue. (21.97Gb available) Update today : do a
custom install using also cross-development!

14:37 : Okay, first things first : get myself a
working TeX-system starting from this page
to get the latest version of TeXShop and the i-Installer and place both
in the Applications folder and in the Dock. Placed the _To Your
Library_ folder of TeXShop in my ~/Library (containing the texmf
etc. path for pdfsync). Then followed this
page
and the i-Installer to install the packages in the right order
:

  • FreeType 2
  • libwmf
  • Ghostscript 8
  • ImageMagick
  • FontForge
  • TeX (did a Full install with 2005 Devel.)

Had a brief look
through the other packages and maybe I’ll install _Latex to RTF_
and _RTF 2 Latex_ later. Created a _DMG_ folder and put
the downloaded disk images into it. Created a _PAPERS_ folder and
transferred the last version of the paper with Stijn to check TeX, but
clearly it couldn’t find the _diagrams.sty_ file (I know I have
to quit using this, but I’d better keep it around for backward
compatibility; put it into ~/Library/texmf/tex/latex/). Ran TeX again
without problems this time and checked the nice source-PDF syncing
(apple-click to jump). Finished : 15:37

15:56 : As long as administration sends me
_Word_ documents and expects me to read them, I have no choice
but to install _Office X_. The upshot was that while searching
for the Office CD I also found the HP LaserJet 1320 CD and installed the
driver, so now I can print 2-sided (using Printer Setup Utility). Done :
16:15

16:45 : Used the
_.mac System Preference_ to get syncing started with my iDisk to
get addresses, calendars and passwords etc. on my iBook. Also filled in
the Sharing Preferences. Now that I have the passwords at hand, it is
time to get the latest versions of some of the shareware I own (and copy
their disk image to the DMG folder)

  • DevonThink
  • DevonAgent
  • Pod2Go : the site seems to be down at the moment but fortunately, I have a disk image of it which will have to do for now (note to self : check later whether the site is permanently dead…) Update today : it is up and running again…

and while I\’m at it I may as well get my wallet out and
purchase the full version of _Lite_ versions I like and use a lot
:

Fortunately, there is also a lot of excellent freeware that I
want to use

One of the following days : MySQL, PHP and perhaps Tracks but
first I desperately need to do some maths to kick off from all this
nonsense…


changes

Tomorrow
I’ll give my last class of the semester (year?) so it is about time to
think about things to do (such as preparing the courses for the
“master program on noncommutative geometry”) and changes to make to
this weblog (now that it passed the 25000 mark it is time for something
different). In the sidebar I’ve added a little poll to let you guess
what changes 2005 will bring to this blog (if I find the time over
Christmas to implement it). In short, @matrix will
become the portal of a little company I’ll start up (seems
_the_ thing to do now). Here are some possible names/goals. Which
one will it be? Vote and find out after Christmas.

WebMathNess is a Web-service company helping lazy
mathematicians to set up their website and make it LaTeXRender savvy
(free restyling every 6 months).

iHomeEntertaining is a
Tech-company helping Mac-families to get the most out of their valuable
computers, focussing on Audio-Photo-Video streaming along their Airport-network.

SnortGipfGames is a Game-company focussing on the
mathematical side of the Gipf project
games
by distributing Snort-versions of them.

NeverendingBooks is a Publishing-company specializing
in neverending mathematical course- and book-projects offering their
hopeless authors print on demand and eprint services.

QuiverMerch is a Merchandising-company specializing in
quivers. For example, T-shirts with the tame quiver classification,
Calogero-Moser coffee mugs, Lego-boxes to construct local quivers
etc.


Jacobian update

One way to increase the blogshare-value of this site might be to
give readers more of what they want. In fact, there is an excellent
guide for those who really want to increase traffic on their site
called 26
Steps to 15k a Day
. A somewhat sobering suggestion is rule S :

“Think about what people want. They
aren't coming to your site to view “your content”,
they are coming to your site looking for “their
content”.”

But how do we know what
people want? Well, by paying attention to Google-referrals according
to rule U :

“The search engines will
tell you exactly what they want to be fed – listen closely, there is
gold in referral logs, it's just a matter of panning for
it.”

And what do these Google-referrals
show over the last couple of days? Well, here are the top recent
key-words given to Google to get here :

13 : carolyn dean jacobian conjecture
11 : carolyn dean jacobian
9 : brauer severi varieties
7 : latexrender
7 : brauer severi
7 : spinor bundles
7 : ingalls azumaya
6 : [Unparseable or potentially dangerous latex formula Error 6 ]
6 : jacobian conjecture carolyn dean

See a pattern? People love to hear right now about
the solution of the Jacobian conjecture in the plane by Carolyn Dean.
Fortunately, there are a couple of things more I can say about this
and it may take a while before you know why there is a photo of Tracy
Chapman next to this post…

First, it seems I only got
part of the Melvin Hochster email. Here is the final part I was unaware
of (thanks to not even wrong)

Earlier papers established the following: if
there is
a counterexample, the leading forms of $f$ and $g$
may
be assumed to have the form $(x^a y^b)^J$ and $(x^a
y^b)^K$,
where $a$ and $b$ are relatively prime and neither
$J$
nor $K$ divides the other (Abhyankar, 1977). It is known
that
$a$ and $b$ cannot both be $1$ (Lang, 1991) and that one
may
assume that $C[f,g]$ does not contain a degree one
polynomial
in $x, y$ (Formanek, 1994).

Let $D_x$ and $D_y$ indicate partial differentiation with respect
to $x$ and $y$, respectively. A difficult result of Bass (1989)
asserts that if $D$ is a non-zero operator that is a polynomial
over $C$ in $x D_x$ and $y D_y$, $G$ is in $C[x,y]$ and $D(G)$
is in $C[f,g]$, then $G$ is in $C[f,g]$.

The proof
proceeds by starting with $f$ and $g$ that give
a
counterexample, and recursively constructing sequences of
elements and derivations with remarkable, intricate and
surprising relationships. Ultimately, a contradiction is
obtained by studying a sequence of positive integers associated
with the degrees of the elements constructed. One delicate
argument shows that the sequence is bounded. Another delicate
argument shows that it is not. Assuming the results described
above, the proof, while complicated, is remarkably self-contained
and can be understood with minimal background in algebra.

  • Mel Hochster
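
(For readers who have never seen the statement: in the plane case the conjecture asserts that for $f,g$ in $\mathbb{C}[x,y]$ with

$\det \begin{pmatrix} D_x f & D_y f \\ D_x g & D_y g \end{pmatrix} \in \mathbb{C}^*$

one must have $\mathbb{C}[f,g]=\mathbb{C}[x,y]$, that is, the map $(x,y) \mapsto (f,g)$ is a polynomial automorphism of the plane.)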

Speaking about the Jacobian
conjecture-post at not even wrong and
the discussion in the comments to it : there were a few instances I
really wanted to join in but I&apos;ll do it here. To begin, I was a
bit surprised by the implicit attack in the post

Dean hasn't published any papers in almost 15 years and is
nominally a lecturer in mathematics education at Michigan.

But this was immediately addressed and retracted in
the comments :

Just curious. What exactly did
you mean by “nominally a lecturer”?
Posted by mm
at November 10, 2004 10:54 PM

I don't know
anything about Carolyn Dean personally, just that one place on the
Michigan web-site refers to her as a “lecturer”, another
as a “visiting lecturer”. As I'm quite well aware from
personal experience, these kinds of titles can refer to all sorts of
different kinds of actual positions. So the title doesn't tell you
much, which is what I was awkwardly expressing.
Posted by Peter
at November 10, 2004 11:05 PM

Well, I know a few things
about Carolyn Dean personally, the most relevant being that she is a
very careful mathematician. I met her a while back (fall of 1985) at
UCSD where she was finishing (or had finished) her Ph.D. If Lance
Small&apos;s description of me had been more reassuring, we
might even have ended up sharing an apartment (quod non). Instead I
ended up with Claudio
Procesi
… Anyway, it was a very enjoyable month with a group
of young starting mathematicians and I fondly remember some
dinner-parties we organized. The last news I heard about Carolyn was
10 to 15 years ago in Oberwolfach when it was rumoured that she had
solved the Jacobian conjecture in the plane… As far as I recall,
the method sketched by Hochster in his email was also the one back
then. Unfortunately, at the time she still didn't have all pieces
in place and a gap was found (was it by Toby Stafford? or was it
Hochster?, I forgot). Anyway, she promptly acknowledged that there was
a gap.
At the time I was dubious about the approach (mostly
because I was secretly trying to solve it myself) but today my gut
feeling is that she really did solve it. In recent years there have
been significant advances in polynomial automorphisms (in particular
the tame-wild problem) and in the study of the Hilbert scheme of
points in the plane (which I always thought might lead to a proof) so
perhaps some of these recent results did give Carolyn clues to finish
off her old approach? I haven't seen one letter of the proof so
I'm merely speculating here. Anyway, Hochster's assurance that
the proof is correct is good enough for me right now.
Another
discussion in the NotEvenWrong-comments was on the issue that several
old problems were recently solved by people who devoted themselves for
several years solely to that problem and didn&apos;t join the parade of
dedicated-follower-of-fashion mathematicians.

It is remarkable that the last decade has seen great progress in
math (Wiles proving Fermat's Last Theorem, Perelman proving the
Poincare Conjecture, now Dean the Jacobian Conjecture), all achieved
by people willing to spend 7 years or more focusing on a single
problem. That's not the way academic research is generally
structured, if you want grants, etc. you should be working on much
shorter term projects. It's also remarkable that two out of three
of these people didn't have a regular tenured position.

I think particle theory should learn from this. If
some of the smarter people in the field would actually spend 7 years
concentrating on one problem, the field might actually go somewhere
instead of being dead in the water
Posted by Peter at November
13, 2004 08:56 AM

Here we come close to a major problem of
today's mathematics. I have the feeling that far too few
mathematicians dedicate themselves to problems in which they have a
personal interest, independent of what the rest of the world might
think about these problems. Far too many resort to doing trendy,
technical mathematics merely because it is approved by so called
&apos;better&apos; mathematicians. Mind you, I admit that I did fall into
that trap myself several times but lately I feel quite relieved to be
doing just the things I like to do no matter what the rest may think
about it. Here is a little bit of advice to some colleagues : get
yourself an iPod and take
some time to listen to songs like this one :

Don't be tempted by the shiny apple
Don't you eat
of a bitter fruit
Hunger only for a taste of justice

Hunger only for a world of truth
'Cause all that you have
is your soul

from Tracy Chapman's All
that you have is your soul


quiver pictures in wordpress

Having latexrender available, one can edit the _class.latexrender.php_
file to include additional LaTeX-packages. For example, adding the lines

 
$string .= "\usepackage{xy}\n";
$string .= "\xyoption{all}\n";
$string .= "\\newcommand{\vtx}[1]{*+[o][F-]{\scriptscriptstyle #1}}\n";

makes it possible to include quiver-pictures in this weblog.
Observe the double backslash before newcommand: a single backslash
would produce a new-line and fail to define anything.
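
For instance (a made-up example), with these definitions a post can contain a small two-vertex quiver between the [tex]-tags via

$\xymatrix{\vtx{a} \ar@/^/[rr] \ar@/_/[rr] & & \vtx{b}}$

which draws two circled vertices $a$ and $b$ with a pair of arrows from $a$ to $b$.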


LatexRender plugin for wordpress under Panther

After
three days of desperate trial-and-error I seem to have managed to get latexrender working for
wordpress under Mac
OS X.
First things first : if you only want to include some
symbols in your blog-posts the easiest way to do so is to use mimetex and the
corresponding
wordpress-plugin
written by Steve Mayer. Follow the
instructions and you will be able to include a limited subset of LaTeX
in your blog within 10 minutes.
If you want more, you have to
work a lot harder. The starting point is to follow Steve’s
blog-entries on latexrender.
But then under Mac OS X you will probably get error messages
when you activate the plugin. The reason seems to be that most versions
of imagemagick available for
OS X require X-terminal support and PHP gets confused between the two
shells. A typical error message is

Warning: copy(70afbabac176169545d01f4bd91f3055.gif): failed to open stream:
No such file or directory in
/Users/lieven/Sites/wordpress/latexrender/class.latexrender.php on line 269

[Unparseable or potentially dangerous latex formula. Error 6 ]

As suggested by Steve Mayer there are
two roads to obtain more information on what goes wrong. The first is to
comment out the _unlink_ commands at the end of the
_class.latexrender.php_ file and look in the _wordpress/latexrender/tmp_
directory to see which conversions were done and which failed. The normal
latexrender-procedure is : tex->dvi->ps->gif. Probably you will
get all files but the gifs!

Another (and more useful) source of
information is to look in the _error-log_ of the Apache-WebServer and
see whether you get things like

This is dvips(k) 5.94a Copyright 2003 Radical Eye Software (www.radicaleye.com)
' TeX output 2004.08.30:1433' -> 0d48700a5dde6d746813733d26dd8df8.ps
. [1]
sh: line 1: gs: command not found
convert: no decode delegate for this image format
'/Users/lieven/Sites/weblog/latexrender/tmp/0d48700a5dde6d746813733d26dd8df8.ps'.
convert: missing an image filename
'/Users/lieven/Sites/weblog/latexrender/tmp/0d48700a5dde6d746813733d26dd8df8.gif'.
identify: unable to open image '0d48700a5dde6d746813733d26dd8df8.gif': No such file or directory.
identify: missing an image filename '0d48700a5dde6d746813733d26dd8df8.gif'.

Here the essential point is that the webserver doesn’t
seem to be able to find GhostScript (even if you have several versions
installed).

To bypass these problems I did two essential
things : (1) in the _class.latexrender.php_ file I rewrote the
conversions so as to use _pdflatex_ instead of tex (to get
immediately a pdf-file rather than the tex->dvi->ps process) and then
use _convert_ to translate this pdf-file into a gif-file. (2) The
versions of _convert_ and _identify_ (both part of the
ImageMagick package) are those provided by Fink, but you should be extremely
careful to install the imagemagick-nox package and not
the imagemagick package! After the command
sudo fink install imagemagick-nox

you are presented with several
configuration choices. Do _not_ choose on auto-pilot the default
choices but look for options specifying that there is no X-support!
After this, everything should work. If you want to have a look at how
I changed the PHP files, mail me.
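
Since the actual diff is only available by mail, here is no more than a rough sketch of the kind of change meant in (1), inside class.latexrender.php (the paths are guesses for a Fink-based setup and the exec-calls are illustrative, not the author’s actual code):

// sketch of the pdflatex-based pipeline (illustrative only)
$pdflatex = "/usr/local/bin/pdflatex"; // assumed location of pdflatex
$convert  = "/sw/bin/convert";         // Fink's imagemagick-nox convert
$tmp      = $this->_tmp_filename;      // basename of the temporary files

// 1. TeX straight to pdf instead of tex -> dvi -> ps
exec($pdflatex." -interaction=nonstopmode ".$tmp.".tex");

// 2. convert the pdf into the final image
exec($convert." -density ".$this->_formula_density.
    " -trim -transparent \"#FFFFFF\" ".$tmp.".pdf ".$tmp.".".$this->_image_format);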
