GSAK (Geocaching Swiss Army Knife)
Complete rewrite of the macro engine.
Firstly, I would like to point out the limitations of the existing expression engine. There are two (in my view) very big issues with the existing engine:
1. String data variables are limited to 16k. For example, if you try to read the cache long description into a variable you will only get the first 16k. Many cache long descriptions exceed this size and it means you are missing valuable information. There are also other times when you need to have string variables greater than this size.
2. The existing engine will not allow the use of " (double quotes) inside a variable. I have used a horrible "kludge" of swapping chr(34) to chr(255) to get around this, but it still only works on a few functions. Currently, many other functions just strip out double quotes so they won't bomb out the macro. You may have also noticed that all $d_ (database) variables strip out double quotes, and this also means you can't update any field in the database with a value containing a double quote.
There are other limitations that are not so significant but would have caused problems later down the track:
1. Function definitions are limited to 3 parameters
2. Engine only supports a maximum of 64 functions
3. Engine does not support the DateTime variable (only date); not a problem now, but this could become a real issue later on
4. Function names are limited to 8 characters (I used another "kludge" here to get around this one)
5. Expression errors just say "error" - no indication of the position in the code where the error is.
6. Relatively slow, as all expressions are run "interpretively"
The new expression engine I have implemented removes all these limitations, thus:
1. String variables can be up to 2GB
2. Any character (including double quotes) can be placed inside a variable
3. Unlimited number of user defined functions
4. Unlimited number of parameters for each function
5. Function names can be up to 256 characters long
6. DateTime variables supported
7. Expression errors reveal the exact position in the code where the error is.
8. Expressions are "pseudo compiled" so they run about 5 times quicker than interpreted ones
Note: The effect of item 8 is somewhat misleading because to make the language backwards compatible there is a certain amount of "pre processing" to be done on each expression. However, testing has shown that even with this pre processing it is still faster than the old engine.
I have tried to make the new macro engine as backwards compatible as I could, but there are a few items that I just couldn't convert over, so existing macros will break if you have used these features/quirks.
I will try to explain in detail so you know what to look out for if you get an error message when running a macro.
1. Variable type can not change.
When a variable is first created it is allocated a type (string, numeric, date, boolean). Once this type is allocated it can't be changed. This is probably the single biggest change that will break some macros out there. For example:
$x = 45
$x = "found"
This code would work fine under the old engine but you will now get an error in the new one. The fix is to just allocate each data type to a separate variable and then use that variable when required:
$x = 45
$y = "found"
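If you genuinely need both forms of the same value, convert it explicitly when building the string. A minimal sketch using NumToStr (the conversion function shown later in these notes); the variable names are just for illustration:

$x = 45
$y = "found " + NumToStr($x)

Here $x stays numeric and $y stays a string, so neither variable ever changes type.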
2. When negating a logical expression (using not or .not. ) you must surround the part you are negating with parentheses.
The help file actually does state this for the old engine too, but unfortunately the old engine does not enforce it. So that this problem won't break most macros, I have made the preprocessor pick up the common usage errors and fix them, but for some of the more complicated ones you will get an error. For example the code:
While .not. $_EOL
The preprocessor will automatically convert to:
While not ($_EOL)
You can save a few CPU cycles by coding it this way in the first place, but at least existing macros won't choke on this widely used scenario.
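Putting this together, a typical database loop under the new engine would look like this. This is only a sketch; the GOTO commands and the # comment are assumed from the standard GSAK macro conventions:

GOTO Position=Top
While not ($_EOL)
  # process the current waypoint here
  GOTO Position=Next
EndWhile

Writing not ($_EOL) with the parentheses up front saves the preprocessor the conversion step described above.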
3. Numeric literals must always have something before the decimal place.
For example, the following code would work in the old macro engine but will throw an error in the new one:
$x = .4567
You must now write this as:
$x = 0.4567
Variable substitution - The new raw engine parser does not support variable substitution, but with a lot of smoke and mirrors in the background (much of it using the preprocessor) the new engine does actually still support it. A lot of the conversion time was spent keeping this feature, because I realised many users have come to love it and fixing up some macros would be quite "ugly" without it. However, there is a cost: speed. Without variable substitution the new engine is up to 5 times faster when processing raw code (for example, a fixed loop of 1000 iterations that does not traverse the database).
For the most part, if you only use variable substitution a little, then your macro will run slightly quicker than the old version. If you use it extensively, it will run slower. This is not really a problem in most macros but if you are looping through a large database and running a lot of code, it could be significant.
Note: The speed difference is not really affected by the VSUB Status=On/Off setting. It is only when you actually write code that takes advantage of variable substitution that the code is slower.
The irony is that the new engine actually has better support for variable substitution as it can now be used with database variables.
For example the following code would be invalid in the old engine because it contains a database variable.
$msg = "Number of logs for this cache is: $d_NumberOfLogs"
Under the old engine you probably would have done it like this:
$logs = $d_NumberOfLogs
$msg = "Number of logs for this cache is: $logs"
However, the most processor-efficient code (and it works in both engines) is:
$msg = "Number of logs for this cache is: " + NumToStr($d_NumberOfLogs)
When writing logical expressions the new parser uses or and and rather than .or. and .and. for logical concatenation. However, the preprocessor will correct this if you forget.
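For example, both of the following are accepted, with the first being the preferred new style (the variables here are hypothetical):

If $found and not ($archived)
  $msg = "available"
EndIf

If $found .and. .not. ($archived)
  $msg = "available"
EndIf

The second form is silently rewritten by the preprocessor into the first.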
Changed arc filter waypoints from memo edit control to richedit control
The current memo control (in the arc filter) only seems to allow for around 1500 waypoints when using Windows 98. By changing this to a rich text control (visually there is no difference) I managed to load over 10,000 waypoints without any problems on my Win98 box (this does not seem to be an issue/limitation on Windows 2000/XP).
Added owner name to the database
I am surprised it took so long to be taken to task on this one. When GSAK was first set up I got a little cute by deciding that "placed by" was the same as "owner name" (in the GPX file I was interrogating at the time they were all the same). This saved space in the database and where "owner name" was required in the GPX export I just substituted the "placed by". Groundspeak allow you to have a "Placed by" and an "owner". Though they are usually the same, there are in fact many instances when they can be different. I have now added this field to the database.
When you open a database that doesn't have this field, GSAK will do the "upgrade". Obviously it can't be retrospective and populate the owner name with the correct details, but it will substitute with the "placed by" which is better than initially leaving it blank. As you load GPX files that have matching waypoints the "owner name" will get updated with the correct information.
You can filter on the owner name (general tab) and also add it to your list of columns to view. The database variable is $d_OwnerName, and if you want to sort on it in the macro language the by token parameter is "OwnerName".
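For example, a macro could now sort on the new field and then reference it directly. This is just a sketch; the exact Sort command usage is assumed from the by token mentioned above:

Sort By="OwnerName"
$msg = "Owner of the current cache: " + $d_OwnerName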
Fixed pocket pc cachemate problem with utf8 encoding
The latest version of CacheMate for the Pocket PC generates GPX files that have some lines terminated with carriage returns only (Windows usually requires a Cr/Lf pair). This did not happen with the CM2GPX created files. This causes the XML reader I am using to "barf" on some of the UTF-8 encodings. Strictly speaking, this shouldn't happen, but this is the first time this component has come across a GPX file with single carriage returns followed by a UTF-8 encoded string. Rather than wait for a fix to this component, I now pre process the file (if orphan carriage returns are found) and convert them to Cr/Lf pairs.
Copyright 2004-2013 CWE Computer Services