/A/ /M/A/G/N/I/F/I/C/E/N/T/ /P/L/A/C/E/
20:58 Tuesday 29 November 2011
Time for a new release of XorCurses. This release fixes some issues with the replay menu and provides visual feedback for scroll-threshold changes. There are also a few developer-centric changes of no interest to users.
Download XorCurses-0.1.3.tar.bz2 source code.
XorCurses on jwm-art.net.
18:35 Wednesday 16 November 2011
Let me tell you a little story. One day in September, October, or November 2010 I was feeling unhappy. There were probably other days during that
time period I felt unhappy but this one is the one I'm writing about. On that day I was feeling extra sorry for myself for I had spent a lot of
time computer programming and it seemed that all my effort went unappreciated.
In my unhappiness I decided I would perform an "experiment". The experiment would be to delete my website and see if anyone cared.
The sad result was that nobody cared. Or put another way, nobody asked me why I had deleted it. In my mind that equated with everything on my
website being worthless. It still seems fairly logical, but the logic is flawed. I would say "without doubt the logic is flawed" but I'm too much
of a doubter for that.
Anyway, I need to stop dwelling on that. Unfortunately I'm very much the type of person who does dwell on things. It requires effort for the bored
mind to resist the negative forces (of which there are many) attracting one to dwell upon them. Luckily though, the effort of resistance can be
transferred to the effort of problem-solving or creativity, provided, that is, one has something to do which requires greater effort. The greater
effort is required in order for a continuum to be created in which effort of creativity/problem solving is continually displacing the negative
forces out of one's mind.
As I understand it though, the negative forces may well be part and parcel of one's self and as such, any effort spent only with the purpose of
displacement is a complete waste of time. Effort spent in creativity which happens to displace the negative forces, if viewed purely from the
perspective of "did it displace the negative forces or not?", is also a complete waste of time. But then if one wishes to be so narrow-minded about
such endeavours, one deserves one's time to be wasted, doesn't one?
07:36 Saturday 12 November 2011
Working on my website PHP code off-line out of the wild... Still.
The keywords facility (as I refer to it) has been optimized further.
"The keywords page - without any keyword refinement - currently
produces 717 links."
717 links is perhaps rather too much choice. Who's going to notice if
keywords used only once are not displayed? That leaves 333. But who's
going to miss the keywords which are used only twice? 230. Thrice? 190.
That's a reduction of 527 links, and one which optimizes for the user
as well as the server by filtering out extraneous information.
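The thresholding idea can be sketched in shell; the real site does this in PHP, and the keyword data below is invented for illustration:

```shell
# Hypothetical data: one keyword occurrence per line, as might be
# collated from every page on the site.
printf 'art\ndigital\nart\nmusic\ndigital\nart\n' > /tmp/keywords.txt

# Count each keyword's frequency, then keep only those used at least
# 'min' times -- the same cutoff idea, with the threshold as a parameter.
sort /tmp/keywords.txt | uniq -c | awk -v min=3 '$1 >= min { print $2 }'
# -> art
```

Raising `min` trims the long tail of rarely-used keywords without touching the ones people actually browse by.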
It also means that the page generation time on:
[sirrom@scrapyard ~]$ uname -a
Linux scrapyard 3.0-ARCH #1 SMP PREEMPT Fri Oct 7 11:35:34 CEST 2011
x86_64 Intel(R) Core(TM)2 Duo CPU E8400 @ 3.00GHz GenuineIntel
is reduced further from 8s to 0.5s and now, to 0.13s.
Tell a lie. I've missed the intermediate step which got the time down to
0.13s: eliminating a call to exec which ran GNU make every time the
keywords were used. Make ran grep to create a file containing each page
name and the keywords defined for it. Because pages are not regularly
added to jwm-art.net, it was a waste of time to continually call exec to
run make to run grep. Additionally, because adding pages generally also
requires SSH access, the call to run make may as well be performed
manually.
The generated file looks somewhat like this (note the capitalization and
white space):
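Based on that description, the raw grep output would look something like this (page names and keywords invented for illustration):

```shell
# Two made-up page .dat files standing in for the real ../dat/ directory.
mkdir -p /tmp/dat
printf 'keywords = Art, Digital Photography\n' > /tmp/dat/SomePage.dat
printf 'keywords = Games, Ncurses\n' > /tmp/dat/XorCurses.dat

# With multiple files, grep prefixes each match with its file name --
# hence the path, extension, capitalization, and whitespace that later
# needed stripping.
grep '^keywords' /tmp/dat/*.dat
# /tmp/dat/SomePage.dat:keywords = Art, Digital Photography
# /tmp/dat/XorCurses.dat:keywords = Games, Ncurses
```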
So every time the keywords page was requested, processing was required
to get the page name by removal of path and extension, and then the
keywords themselves by removing everything before the equals sign.
I decided that some of this processing such as whitespace removal
and de-capitalisation could be reduced by moving it into the Makefile.
This led me on a regular-expression learning quest, and to sed, a stream editor for filtering and transforming text.
The Makefile looks somewhat like this:
KEYLIST := keysdats.list
FILES := $(wildcard ../dat/*.dat)
$(KEYLIST): $(FILES)
	grep '^keywords' $^ | sed -f sed_clean_grep_output > $@
And running make using that Makefile produces the keywordpages.txt
file which now looks like this:
Which is achieved by using sed with expressions contained in the
sed_clean_grep_output file which contains:
s/keywords \?= \?//
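Assuming expressions along those lines, the whole cleanup can be demonstrated on one line of grep output (the path-stripping and lower-casing steps here are my guesses at the handling described above, not the actual sed_clean_grep_output contents):

```shell
# Strip the leading path, collapse '.dat:keywords =' down to ' = ' so
# only the page name and its keywords remain, then lower-case the lot.
printf 'dat/SomePage.dat:keywords = Art, Digital Photography\n' |
sed -e 's|^.*/||' -e 's|\.dat:keywords *= *| = |' |
tr 'A-Z' 'a-z'
# somepage = art, digital photography
```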
Now, what of the keyword results? Click the 'digital' keyword and you
used to be presented with over 200 results: each a thumbnail entry
(thumbnail image, title, and short description), plus 250 (IIRC)
characters of text culled from the 'information' section of each page,
wrapped in an HTML <div> element using the 'result' class.
The first results optimization: remove the long description (i.e. the
culled text), and thus the need for the extra <div>, plus the CSS for
the 'result' class. Removing that HTML/CSS alone is a reduction of
4.6kB in the results for the 'digital' keyword.
The second results optimization: display only ten results at a time.
It was purely out of laziness that I hadn't attempted this until now...
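The ten-at-a-time idea can be sketched in shell; the real code slices results in PHP, and the result list below is a stand-in:

```shell
# 23 stand-in results, one per line.
seq 1 23 > /tmp/results.txt

# Print page N (zero-based) of ten results using a sed line range.
page=1
start=$((page * 10 + 1))
sed -n "${start},$((start + 9))p" /tmp/results.txt
# prints stand-in results 11 through 20
```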
That was the week before last.
06:28 Saturday 12 November 2011
"the joy of eliminating code that sucks is so great i must tell you all about it right now in case the world ends this very minute"
Working on my website PHP code off-line out of the wild. It's code which has existed for a few years now. Some of the routines might hark back to when I first started coding in PHP in 2005(?).
At some point between now and then I created a keywords facility. The first implementation was really slow. The second implementation was a vast improvement in that it didn't collate and order all the keywords twice.
As my skills in PHP advanced I wanted it to do more, and consequently added bits and pieces here and there. One of those bits and pieces was the ability to link to the parent of a page, a random page, or the current page. Having once coded in C++, the way of linking to the current page was to link to *this, and consequently linking to a random page became *random. The easy way of getting this to work was to blindly run str_replace on *every single link* that was created.
The keywords page - without any keyword refinement - currently produces 717 links. That's over 2100 calls to str_replace. In Debian it doesn't seem to be an issue, but on Arch, generating the page took 8 seconds.
After a few false starts micro-optimizing link and keyword code to no avail, I saw the str_replace calls and wondered what would happen if I made them conditional on strpos finding an asterisk in the link data.
Well what do you know? That simple test reduces the keyword page build time down from 8 seconds to just under half a second. Lah-dee lah-deedah!
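A shell analogue of that guard (the site's real code uses PHP's strpos and str_replace; the link text and substitution targets below are invented):

```shell
# Only run the costly substitutions when the link actually contains an
# asterisk; otherwise emit it untouched.
expand_link() {
    case $1 in
    (*'*'*)
        # Slow path: the link contains an asterisk, so substitute.
        printf '%s\n' "$1" | sed -e 's|\*this|current-page|' -e 's|\*random|random-page|'
        ;;
    (*)
        # Fast path: no asterisk, no substitution work at all.
        printf '%s\n' "$1"
        ;;
    esac
}

expand_link 'href=*this'    # -> href=current-page
expand_link 'href=page42'   # -> href=page42
```

Since the vast majority of links contain no asterisk, almost every call takes the fast path, which is exactly why the page build time collapsed.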