Friday, December 12, 2008

Zombie Mansion released for the iPhone

Zombie Mansion is a first person shooter game I have been developing for the past few months (and the cause of my receding hairline). It finally got approved yesterday.

See here for screenshots and more info.
See here for the App store page.

Sunday, November 23, 2008

iPhone Dev: "The binary you uploaded was invalid. The signature was invalid, or it was not signed with an Apple submission certificate."

I finally finished my iPhone game this week (Zombie Mansion). All I had to do on Saturday was upload it to the App Store and spend the rest of the weekend relaxing at the beach.

Not so fast. I have spent all weekend battling the dreaded "The binary you uploaded was invalid. The signature was invalid, or it was not signed with an Apple submission certificate." error. As is often the case with error messages, the message itself was of no use whatsoever in solving the problem. It was in fact a positive hindrance, as I spent several hours redoing my distribution certificates, provisioning profiles and waving dead chickens (frozen) over my keyboard.

In the end I resolved it by moving my dev folder off the thumb drive (FAT32 formatted) and onto the main Mac partition (Mac OS Extended format) and rebuilding.

For the benefit of any other poor buggers in the same straits, here is a list of tips I picked up while investigating this:

  • Build on a Mac OS Extended formatted drive (you only need to do this for the App Store build)
  • Make sure you have a 57x57 icon called (case sensitive) "Icon.png"
  • Check the spelling of the code signing identity
  • Make sure your bundle app ID is correct
  • Make sure the app bundle contains the following:
    CodeResources
    _CodeSignature
    embedded.mobileprovision

One thing I found helpful was having an ad-hoc version. That let me sort out problems with certificates much more easily than doing continuous uploads.

Links:
iPhone Dev forums discussion: https://devforums.apple.com/message/12311#12311
Zombie Mansion: http://sourceitsoftware.com/zombie.html


Monday, October 13, 2008

Getting line numbers in dunit test

I forget this every time:(

By default, dunit gives the address of where your unit tests failed.
To get the line numbers instead, do the following:

  1. Install the JCL from http://sourceforge.net/projects/jcl
  2. In your test project settings, add the conditional define USE_JEDI_JCL (Directories/Conditionals page)
  3. In your test project settings, set Map File to detailed (Linker page)
  4. Rebuild your project
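
For reference, a minimal DUnit test case looks something like the sketch below (the unit, class and method names are just placeholders). With USE_JEDI_JCL defined and a detailed map file, a failure in a test like this should be reported with the unit name and line number rather than just an address:

unit MyTests;

interface

uses
  TestFramework;

type
  TMyTests = class(TTestCase)
  published
    procedure TestSomething;
  end;

implementation

procedure TMyTests.TestSomething;
begin
  // Deliberately failing check; with the JCL hooked in, the failure report
  // should point at this unit and line rather than a raw address.
  CheckEquals(5, 2 + 2, 'simple arithmetic check');
end;

initialization
  RegisterTest(TMyTests.Suite);

end.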

Tuesday, September 16, 2008

Delphi 2009 = A good test of my backup strategy

I installed Delphi 2009 today. In a fit of enthusiasm, I installed it onto my Delphi 2007 virtual machine...

That turned out not to be such a smart move. Not only did D2009 fail to install (stopping halfway through with a 'failed to find setup.msi' error), but it trashed my D2007 install as well. I don't know how bad the trashing was; I killed Delphi after the 10th "package not found" error on start up.

I am not blaming CodeGear for this; my D2007 install was "customised" to fit on a small virtual machine, with several, possibly important, folders deleted, and with little free space remaining.

One of the many nice things about using a virtual machine is the ease of backups and recovery. In this case, it came down to a 2 step process:
  1. Copy Friday's backup onto the computer
  2. Use Subversion to retrieve all the files changed since Friday
Total elapsed time to recover was about 10 minutes, most of which was spent making coffee while waiting for the files to copy.

Moral of the story: If you are installing D2009 (or any other version), put it in a virtual machine.
And use version control
And back up regularly

Monday, August 18, 2008

Programming for the iPhone really sucks

Ogrampray, ergo sum. I program, therefore I am.

I have wanted to program nearly every device that I own (except for video recorders).
Now that I have an iPhone, I want to program that. Unfortunately there are a number of roadblocks in the way...

NDA
The iPhone NDA is ridiculously draconian. There are enough posts on the subject that I won't go into details. See here if you want to know how developers feel about it. Basically, you need to figure everything out yourself coz you can't ask anyone else. There are still discussion groups, but they may be gone tomorrow. There is also a series of tutorials at IphoneSdkArticles.com but that may also disappear.

Objective C
ATM, your choice of development language is Objective C or Objective C.
If you are a Windows developer, your first question is probably "Wtf is Objective C?". The short answer is that it is yet another version of C with objects, designed by someone with an unholy fascination for square brackets. The average line of code contains slightly more text than symbols, but only just.

ObjC is primarily used on Apple machines, and sits at #42 in the TIOBE list, just below Erlang.

Because it is Apple only, the development tools only run on OS X (you can use gcc on Windows, see here, but it's not easy). The tools may be free, but you need a $1000 OS X machine to run them on.

The SDK license expressly forbids interpreters, JITers and iPhone based compilers. So the only way to get Java or Mono is if they develop an ahead-of-time compiler. Here's hoping.

Multitasking
Nope, sorry, you don't need it. Applications run full screen, single window. When the user presses the home button, your app exits. To get back to your app, the user needs to start it up all over again. This immediately rules out a large number of interesting applications, and adds a certain amount of complication to development. As an aside, Windows Mobile does exactly the opposite and minimises applications rather than closing them, so they can reopen more quickly. This approach is also arse.

Application sandbox
Each application is stored in a single folder. All files, settings and related documents are stored within that folder. The application can only access the contents of its own folder. There is no concept of a user documents folder. There is also no simple way to get documents onto the phone for use by your app.

Example: I have an ebook on my computer. I would like to read it on my phone (I have kids, I spend a lot of time sitting in the car waiting). On a Windows Mobile machine, a Palm, or even my old Psion S5, you copy the file over onto your machine, and open it. On an iPhone, it's not so easy. Using eReader, I need to upload the ebook to their website, and then download it again on the iPhone. Alternately, they helpfully suggest, I can run a web-server on my PC.
On the other hand, the Stylus and Bookshelf book readers provide desktop software (a 50MB download, written in Java) that will let you copy documents over using wi-fi. If I ever need to transfer documents when I don't have a wi-fi router, I am in for a large amount of aggravation.

So, if you are writing an app that needs to work with documents, you also need to write a client/server file transfer application, in a different programming language, just to get your document to where you can use it!

Distribution
As a means of getting your application to a large number of paying punters, the App Store is not too bad. The 30% commission is high compared to Paypal, Regnow, SwReg etc, but low compared to phone/pda specific sellers such as Handango. However as a means of distributing your app to a specific group (i.e. company-wide as opposed to worldwide) it is less useful.

Positives
There are some good points about iPhone development though.
  • The hardware is essentially the same (+/- 3G, GPS) on every device. This is a significant contrast to Windows Mobile, where the screen size and orientation can change, and there may or may not be a touch screen, camera, GPS, wi-fi, internet etc.
  • The App Store makes purchasing applications dead easy. I suspect that iPhone users will end up with more applications than Windows Mobile users.
  • The development tools are quite polished, and cheap if you already have an Apple machine.
  • The phone has lots of useful functionality. GPS, wi-fi, accelerometers, camera etc. I can't think of any other phone that has sold as well and has as many toys to play with.
  • The Xcode development tools can also deal with C++ and C code, although the SDK is in Objective C.
If you are an Apple developer, developing for the iPhone is a no-brainer. For a Windows developer, it is a much harder decision.

Links
NDA comments
SDK Discussion group
iPhone SDK Articles
Wikipedia on objective C
iPhone Development on Windows

Friday, August 15, 2008

Subversion add-ins for Delphi

Part 4 of an n part series on Subversion and Delphi

Subversion add-ins
As we have already established, TortoiseSvn adds version control functionality to Windows Explorer. This is all good, and the result is quite functional. Many people need nothing more. However for the rest of us, there are a number of ways to integrate Subversion into Delphi.

File Browser
If you have a recent version of Delphi, and have TortoiseSvn installed, then you get integration for free. The File Browser window (View menu -> File Browser) uses the Windows Explorer context menu, so all TortoiseSvn commands also work with the File Browser.



JCL Version
The JEDI Code Library (JCL) includes a Tortoise (SVN and CVS) add-in that adds a JCL Version menu. Most functionality is available, although I have never managed to get Diff working. It requires Tortoise to be installed. If you are already using Tortoise and the JCL, it's a bit of a no-brainer.




Others


There are a few others that I haven't used.
  • Delphi Svn is available from SourceForge. It looks like it doesn't need Tortoise to be installed.
  • The gloriously named delphiaddinfortortoisesvn is available from tigris.org, home of TortoiseSvn. As you might expect, it requires Tortoise to be installed.

Links
Tortoise
JEDI Code Library
Delphi Svn
Delphi Addin (Tigris)

Monday, July 28, 2008

Common tasks with Subversion

Part 3 of an n part series.

General workflow for version control

The following descriptions all use TortoiseSVN and Explorer. The steps can also be done using other Subversion clients such as DelphiSvn or JCL.

The general workflow for changing code is:
  1. Update. Right click on your base development folder and choose TortoiseSvn -> Update.
  2. Resolve conflicts
  3. code like a demon...
  4. Test, test and test
  5. Update (again). As in step 1, just in case someone else has changed things
  6. Resolve conflicts (again)
  7. Add (also Delete and Ignore as required)
  8. Commit
Notes:
  • Solo developers can normally leave out steps 1, 2, 5 and 6
  • Steps 7 and 8 can be combined (check the "Show unversioned files" box in the commit dialog)
For a solo developer, the process essentially comes down to
  • Commit. Do Adds, Deletes etc from the commit dialog
Now isn't that easier than zipping up all your source code and archiving it?

Traps for young players
  • Deletes and (particularly) Renames should be done from within Explorer/Tortoise.
  • Do commits from the base directory where possible. If you commit further down the tree, then you need to do an update before you can commit from the base.
(Edit) Team Version Control
One thing I didn't point out first time round (thanks Lars).
The golden rule of team version control:
Only check in working code to the trunk.
If you are working in a team environment, and you want to commit non-working code, then create a branch. That will let you have all the benefits of version control without annoying everyone you work with. Once you have completed your changes, merge your changes back into the trunk. See the manual for details.
Also, Sean's rule number 2 (for team programming):
Always update before you commit
If you commit without an update first, the code in the repository may not be the same as your code. That means that you don't know that the tests still pass, or that the code still compiles.

E.g. Bob changes the signature of DoSomething to take an extra parameter. It's in a file that you haven't touched, so there are no conflicts. All of your code uses the old signature, so when you commit, the code in the repository won't compile any more.

If you update first, you can catch the problem and avoid making a fool of yourself. This sort of scenario can be quite common with larger teams. Don't ask me how I know.

Manual
All Tortoise commands are available from the Explorer context menu. See here for the manual.

Saturday, July 26, 2008

Subversion server options

I didn't cover subversion servers in my last post as I don't use one any more. However there have been some queries so here are a few ideas:

Windows installers
VisualSVN Server
official Svn 1-click setup
Other OS

VMWare appliances
Subversion and WebSVN on Ubuntu Server
SVN VM
vmTrac

Online hosting
Cvs Dude
Google search

I am not endorsing any of these. However, if I needed a server, I would probably go with one of the online solutions.



Monday, July 7, 2008

Starting out with Delphi and subversion

I am always disturbed by the number of programmers I run into who don't use version control. A version control system (VCS) is one of the fundamental tools of a programmer, up there with the ide/compiler, bug tracking and backups.

Subversion, together with the TortoiseSvn plugin for Explorer, is one of the better free VCS solutions available. What follows is a quick guide to setting it up the easy way.

Note: This setup is not suitable if you have a large number of users, or need to provide access across the internet.

Tortoise and Subversion

Subversion is the actual VCS. Typically you would run it as a service and then connect to it with client software such as Tortoise.

Tortoise adds context menus and icon overlays to Explorer allowing control of the VCS. These menus and overlays are also available in programs that use the Explorer menus and icons such as the Delphi File Explorer.

Other clients for Subversion are also available.

If you only have a limited number of users, and do not need remote access, you can use Tortoise without a subversion server. This makes set-up much easier. I am not going to cover setting up the server here, primarily because I haven't used one for 3 years.

My file structure

I have 2 base folders, D:\dev and d:\devother.

D:\devother\ is used for files that use different version control settings (i.e. open source projects that I update from their own repository, such as tiOPF).
D:\dev\ is used for everything else; my code, third party components, images etc. Having everything under one folder makes version control easier.

I recommend moving all source under a single base folder, if it is not already. If you have source on a network share, move it into a base folder on a local drive. If you need to share source, place the repository on a shared folder.

Repository

The repository is where the versions are stored. It has a directory structure similar to a file structure.

The subversion book recommends having the following directories at the repository root

/branches/
/trunk/
/tags/

With this layout, your main development takes place under /trunk/. Branches and tags go where you would expect. I don't do a lot of branching and tagging; when I do, I place the branch directories alongside the main project directories instead. Therefore I don't bother with the initial directories and create branch directories as required.

My layout is more like

/components/
...
/projects/
/projects/project 1 v1/
/projects/project 1 v2/
...

This maps to my file structure of

d:\dev\
d:\dev\components\
...
d:\dev\projects
d:\dev\projects\project 1 v1\
d:\dev\projects\project 1 v2\
...
etc

It is reasonably easy to change from one repository layout to another if you change your mind, so choose whichever structure makes sense to you.


Setup
  1. Download and install Tortoise from here.
  2. Open Windows Explorer
  3. Create an empty folder to serve as your repository. This can be local or on a network drive. The location needs to be reasonably safe and easy to back up.
  4. Right click on the folder and click on "Create repository here"
  5. Choose "Native filesystem"
  6. If you want to set up /branches/, /tags/ and /trunk/ directories, right click on your repository folder and choose TortoiseSvn -> Repo-browser. Right click on the root directory and choose "Create Folder" for each of the initial directories.
  7. Right click on any folder and choose TortoiseSvn -> Settings
  8. In the Global ignore patterns, enter in the file types that should be ignored by default. Ie things like map files that will (almost) never need to be under version control. I use "*.dcu *.~* dcu temp *.exe *.zip *.bkm *.ddp *.cfg *.dof *.dsk *.ini *.hlp *.gid *.bmp *.png *.gif ~* *.log bin debug release *.map *.chm *.bkf Thumbs.db *.mdb .obj *.elf *.stat *.ddp *.bpl *.map *.GID *.hlp *.opt *.dll *.raw *.BIN *.obj *.pdb *.scc Debug Release *.xml obj *.~* *.backup *.INI *.ArmLog *.KeyLog *.NanoLog *.Stats *.PreARM *.old *.drc *.*~ *.doc *.pdf *.bmp *.jpg *.MRW *.NEF *.ORF *.psd *.X3F __history *.local *.identcache *.bak Thumbs.db *.ldb *.dex *.rar DllDcu *.lck CVS cvs *.txt *.TXT *.jdbg *.HLP *.KWF *.xls *.cnt *.dsm *.dti *.tmp *.lnk *.cbk *.mes"
    Note that the patterns are case sensitive.

Initial Import
  1. Back up your source code!
  2. Right click on your base development folder (I use d:\dev\) and click on "SVN Checkout"
  3. Click on the browse button beside URL and navigate to the repository directory (if you have set up a /trunk/ directory, navigate to this).
  4. Right click on your base development folder (d:\dev\ in my case) and choose TortoiseSVN->Add.
  5. Wait for the Add dialog to populate. The first time it is used, this can take a while. Work your way down the file/folder list.
    If there is something you don't want to add now, uncheck the selection box.
    If you never want to add it, right click on the item and choose "Add to ignore list"
  6. Once you are happy with the selected files and folders, click OK.

    The files are now all added to version control. However they have not yet been saved (committed).
  7. Right click on the base development folder again and choose Commit.
  8. Provide a message.
  9. Uncheck any files you don't want to commit at the moment and click OK.

Your files are now under version control. You can revert to any version, perform diffs, see what files are changed and perform all other sorts of good things.

Version controlled files are marked to indicate their status. A full list is given here. The main ones are
  • Committed: Check mark
  • Added: +
  • Changed: !
  • Deleted: x



To come
I will do followup posts explaining
  • How to work with files on an ongoing basis
  • Various Delphi integration options

Links
Subversion http://subversion.tigris.org/
Subversion book http://svnbook.red-bean.com/
Tortoise http://tortoisesvn.tigris.org/

Saturday, June 21, 2008

ti Object persistence framework updated

What's up?

v2.50 of tiOPF is now available at http://tiopf.sourceforge.net/

So what's tiOPF?

tiOPF is an Object Persistence Framework. That is, it is a framework based around saving your objects to, and loading them from, databases and/or flat files. See the overview for more details.

In a nutshell it lets you do things like:

var
  user: TUser;
  userList: TUserList;
...

user := TUser.CreateNew;
user.FirstName := 'Sean';
user.LastName := 'Cross';

user.Save;

...
userList := TUserList.Create;
userList.Load;

for user in userList do
  ...


tiOPF handles the saving and loading of objects to databases and flat files. You can swap between databases by initialising a new persistence layer.


Why should I care?

  • tiOPF lets you code in objects rather than datasets (you can still use data aware controls though). This gives you more object orientated code
  • tiOPF provides database independence
  • Better code reuse. Because the persistence is separated out, you can use the same code across different databases and structures
  • Easier unit testing (in my experience anyway). I find it much easier to set up objects and test them than to set up databases and test.

Links

Home page: http://tiopf.sourceforge.net/
Overview: http://tiopf.sourceforge.net/Doc/overview/index.shtml
Newsgroups: http://tiopf.sourceforge.net/Support.shtml

Thursday, June 12, 2008

" Connection is busy with results for another hstmt"

One of the problems with working on the same program for years is that you always end up paying for your sins. In my case the sin in question is still using the BDE and ODBC.

"Connection is busy with results for another hstmt" is a common error when connecting to ODBC databases. It occurs because an OBDC connection can only have one active cursor at a time. By default, odbc only retrieves the first 20 records for a query. If the query contains more than 20 records, the rest are retrieved on demand. That's all very well, but when you open a second query using same connection/session you get the dreaded hstmt error.

I thought I had beaten it years ago, but my latest set of changes have resulted in the error reappearing.

There are a few possible solutions to this error:

  1. Use TTable components. These don't have the error as they open a new connection each time. Not a very pretty solution :(
  2. Put a FetchAll after each Open. This forces the retrieval of all records. Not practical in my case as I have around 250 TQuery components to check.
  3. Use multiple connections. Also not practical for me.
  4. Replace the BDE with something else that doesn't have the problem. In progress but...
  5. Cheat. Set the ODBC rowset size to a larger number. If you set it to 200, then the first 200 records will be returned. If you set it to -1 then ALL records will be returned. This could have dire effects on performance if your queries return a lot of records.
This time round I discovered, and went with, option 5.

To set the rowset size with the BDE, do the following:
  1. Click on your TDatabase component
  2. In the object inspector, expand Params
  3. Put in a Key of "ROWSET SIZE"
  4. Put in the desired value
Changing the rowset size should work with other ODBC connection components as well.
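
If you would rather set it in code than in the Object Inspector, something along these lines should work (a quick sketch, assuming a BDE TDatabase and that the parameter is applied before the connection is opened; the procedure name is just for illustration):

uses
  DBTables; // BDE components (TDatabase, TQuery)

procedure SetRowsetSize(ADatabase: TDatabase; const ARows: string);
begin
  // The parameter only takes effect on a fresh connection.
  ADatabase.Connected := False;
  // '-1' returns ALL records; '200' would return the first 200 records.
  ADatabase.Params.Values['ROWSET SIZE'] := ARows;
  ADatabase.Connected := True;
end;

// e.g. SetRowsetSize(Database1, '200');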

Update: As Otto has pointed out, the error can also be resolved, if you are using SQL Server 2005 or later, by using the SQL Native Client drivers. See Multiple Active Record Sets for more details. Doesn't help me though :(

Sunday, June 8, 2008

Coverflow update

TMS GUIMotions

TMS Software have released their 3d animation component, GUIMotions. It includes a number of animations including a coverflow effect. More info is available here.

My GLScene example

I have updated my coverflow example. I have included mirroring, multiple rows, vertical layout and a few other changes making it easier to reuse. I have also shown how to use standard VCL controls.

You can download the latest version from here.






Using an editable TMemo control


2x2, Transitioning from horizontal to vertical layout


Vertical layout, 2 across


Horizontal layout 2 across

Links
GuiMotions
GLScene coverflow demo
GLScene

Friday, May 16, 2008

Coverflow example using Delphi and GLScene

Problem
I am beginning work on Pics Print 4. One of the things I want to do with this release is make it significantly more slick. To that end, I thought "Coverflow". See here for an example.

The trouble is that there is not that much in the way of Delphi examples. I did find "Flying Cow", which is a Delphi implementation using OpenGL.

Flying Cow
Flying Cow is a clone of Cover Flow written by Matías Andrés Moreno (http://www.matiasmoreno.com.ar/ website currently down)

Changes to make Flying Cow compile under D2007, and download links, are listed in http://www.aqua-soft.org/board/showthread.php?t=46566 (website currently down as well)

Flying Cow looks quite nice. However it has one huge drawback from my point of view: it uses the GPL and is therefore unsuitable for my use.


Flying Cow implementation


GLScene implementation
I have created my own version using GLScene. Considering that I haven't done any 3d work since 2005 (on a pda using C++ at that), it was remarkably easy to get going without cracking open my OpenGL books.

Here is the result:


Basic layout


Showing 2 images across, and using transparency on mouseover to show hidden images.


Now with added mirroring goodness



Now with vertical layout, showing 2 across and 2 down

You can download my source and the executable from here. It is written in D2007 but should be easily portable to earlier versions (there may be a for ... in loop or two but that is about it).

This is prototype code, not production code so be warned. I take no responsibility if it eats your homework, backchats your mother-in-law or transfers all your money into my bank account. On the plus side, it has a promiscuous license so you can use it in your own apps without problems.

If you make any improvements, let me know and I will update the source accordingly.

Update
Mirroring added

Update 2 - May 30
Code changed to use an object list instead of an array, removed Graphics32, given optional vertical layout, allows multiple rows, and demonstrates removing pages.

Links
JavaScript implementation
http://flashloaded.com/flashcomponents/photoflow/example1.html

Flying cow discussion thread
http://www.aqua-soft.org/board/showthread.php?t=46566

GLScene
http://glscene.sourceforge.net/wikka/HomePage

My version
http://www.sourceitsoftware.com/download/delphi/CoverTest.zip

Tuesday, April 22, 2008

Database versioning part 2

In part 1, I discussed database versioning with MS SQL Server. SQL Server has robust tools available that simplify the process of differencing and creating upgrade scripts. DBISAM doesn't have these tools, but using datamodules and table components makes versioning close to painless.

Situation 2
Delphi app
DBISAM database
1 developer

Rental Property Manager uses a DBISAM database (DBISAM v3 in RPM 1, DBISAM v4 in RPM 2). In the 4 years since it was released, it has been through about 20 changes.

Unlike SQL Server, there is no database comparison utility to generate upgrade scripts. That means things have to be done a bit differently.

I still use the same general process as for SQL Server:
  1. Record the database version number in the database
  2. Define the expected db version in the application
  3. Keep a master db for comparison
  4. Only make one set of changes at a time
  5. Automate differencing as much as possible
  6. Unit test, test and test again. The tests should fail the moment a table is modified.
Because DBISAM doesn't support views, the version number is stored (along with a bunch of other info) in an ini file in the database directory.
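
As a rough sketch of what that looks like (the file name, section and key below are made up for illustration, not the actual RPM ones):

uses
  IniFiles, SysUtils;

const
  DB_INFO_FILE = 'dbinfo.ini';   // hypothetical file name
  DB_SECTION   = 'Database';     // hypothetical section name

function ReadDatabaseVersion(const databasePath: string): integer;
var
  ini: TIniFile;
begin
  ini := TIniFile.Create(IncludeTrailingPathDelimiter(databasePath) + DB_INFO_FILE);
  try
    // 0 is returned for a brand new database with no ini file yet
    Result := ini.ReadInteger(DB_SECTION, 'Version', 0);
  finally
    ini.Free;
  end;
end;

procedure WriteDatabaseVersion(const databasePath: string; version: integer);
var
  ini: TIniFile;
begin
  ini := TIniFile.Create(IncludeTrailingPathDelimiter(databasePath) + DB_INFO_FILE);
  try
    ini.WriteInteger(DB_SECTION, 'Version', version);
  finally
    ini.Free;
  end;
end;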

I have a datamodule, TdmodCheckDatabase. This has a TDBISAMTable component for every table in the database. The table component contains all fields in the table and is updated whenever the table is changed.

To make database changes, the following process was used:

  1. Increase the version number in the application
  2. Make and test DB changes.
  3. Update the affected tables in TdmodCheckDatabase
  4. If necessary (rarely) add further upgrade queries to TdmodCheckDatabase. E.g. to set the values of new fields, or to add new data rows.
  5. Generate a CreateDatabase unit script using the supplied database tools.
  6. Update unit tests to suit the new db
When the application is run, it goes through the following process
  1. If no database is found, then run CreateDatabase unit and then do step 3
  2. Get the current version number from the database ini file
  3. If it is less than the expected version number then
    Run CreateDatabase (to create any new tables)
    Check every table component in TdmodCheckDatabase
    Apply any table changes
    run any manual upgrade scripts
  4. Update the version number in the database ini file.
In code this is:


class procedure TdmodCheckDatabase.UpgradeDatabase(databasePath: string; currentVersion, newVersion: integer);
var
  module: TdmodCheckDatabase;
  f: integer;
begin
  module := TdmodCheckDatabase.Create(nil);
  try
    module.OpenDatabase(databasePath);

    for f := 0 to module.ComponentCount - 1 do
    begin
      if module.Components[f] is TDBISAMTable then
      begin
        try
          // if we need to upgrade table to dbisam 4
          if currentVersion <= DB_VERSION_FOR_DBISAM4 then
            TDBISAMTable(module.Components[f]).UpgradeTable;

          module.UpgradeTable(TDBISAMTable(module.Components[f]));
        except
          // logging and error stuff removed
        end;
      end;
    end;

    for f := currentVersion + 1 to newVersion do
      module.RunUpgradeScripts(f);

    // have to create additional indexes manually
    module.sqlMakeIndexes.ExecSQL;
  finally
    module.DBISAMDatabase1.Close;
    module.Free;
  end;
end;

procedure TdmodCheckDatabase.UpgradeTable(table: TDBISAMTable);
var
  fieldIndex: integer;
  needsRestructure: boolean;
  canonical: TField;
begin
  needsRestructure := false;

  table.FieldDefs.Update;

  // add any new fields to the FieldDefs
  if table.FieldDefs.Count < table.FieldCount then
  begin
    for fieldIndex := table.FieldDefs.Count to table.Fields.Count - 1 do
    begin
      table.FieldDefs.Add(fieldIndex + 1, table.Fields[fieldIndex].FieldName,
        table.Fields[fieldIndex].DataType, table.Fields[fieldIndex].Size,
        table.Fields[fieldIndex].Required);
    end;
    needsRestructure := true;
  end;

  // make sure we have the correct size for string fields
  for fieldIndex := 0 to table.FieldDefs.Count - 1 do
  begin
    if (table.FieldDefs[fieldIndex].DataType = ftString) then
    begin
      canonical := table.FindField(table.FieldDefs[fieldIndex].Name);
      if Assigned(canonical) and (table.FieldDefs[fieldIndex].Size <> canonical.Size) then
      begin
        // field size has changed
        needsRestructure := true;
        table.FieldDefs[fieldIndex].Size := canonical.Size;
      end;
    end;
  end;

  if needsRestructure then
    table.AlterTable(); // upgrades the table using the new FieldDef values
end;

procedure TdmodCheckDatabase.RunUpgradeScripts(newVersion: integer);
begin
  case newVersion of
    3: sqlVersion3.ExecSQL;
    9: sqlVersion9.ExecSQL;
    11: begin // change to DBISAM 4
      sqlVersion11a.ExecSQL;
      sqlVersion11b.ExecSQL;
      sqlVersion11c.ExecSQL;
      sqlVersion11d.ExecSQL;
      sqlVersion11e.ExecSQL;
    end;
    19: sqlVersion19.ExecSQL;
    20: sqlVersion20.ExecSQL;
  end;
end;

Unit tests included:
  • Make sure the current version is correct (sketched below)
  • Make sure that every table and every field exists
  • Create a new blank database (for a number of different versions) and work through the upgrade process to make sure the final database is correct.
  • Restore an existing older database with data and upgrade to the latest version
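The version check is the simplest of these tests; a sketch of it might look like the following (EXPECTED_DB_VERSION, TestDatabasePath and the ReadDatabaseVersion helper are placeholders for however the application exposes its expected version and the ini reading sketched above):

uses
  TestFramework;

type
  TTestDatabaseVersion = class(TTestCase)
  published
    procedure TestVersionIsCurrent;
  end;

procedure TTestDatabaseVersion.TestVersionIsCurrent;
begin
  // EXPECTED_DB_VERSION is the version constant the application checks on
  // startup; ReadDatabaseVersion reads the version key from the database ini.
  CheckEquals(EXPECTED_DB_VERSION, ReadDatabaseVersion(TestDatabasePath),
    'database version does not match what the application expects');
end;

initialization
  RegisterTest(TTestDatabaseVersion.Suite);
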
With this process, altering the database structure is trivial for most changes. Adding fields and tables usually requires no more work than updating the table components and generating a new creation script.

The current implementation does have a couple of restrictions in that it won't remove tables or fields. However if that is required, it won't take long to add.

Thursday, April 17, 2008

Database versioning part 1

Versioning databases is one of those ongoing problems that has no one-size-fits-all solution. There are 2 solutions I have developed and used successfully.

The general process I use each time is:
  1. Record the database version number in the database
  2. Define the expected db version in the application
  3. Keep a master db for comparison
  4. Only make one set of changes at a time
  5. Automate differencing as much as possible
  6. Unit test, test and test again. The tests should fail the moment a table is modified.

Situation 1
MSDE (= SQL Server) database
8 developers (pair programming)
C# application

Every developer had 2 databases, Unit Test db and Application Test DB (UnitDb and AppDb from now on). With MSDE (now SQL Express) there are no licensing costs to worry about.
There was also a Master database stored on a central server that served as the canonical reference.

The database version number was stored in a view ('select xx as VersionNumber').

To make database changes, the following process was used:
  1. Check out the latest version of the app
  2. Increase the version number in the application
  3. Make and test DB changes. Usually this was done in UnitDB. AppDb was kept synchronised using SQL Compare
  4. Update the version number in UnitDB
  5. Generate an update script using Sql Compare. Scripts were named UpgradeXXX.sql where XXX is the version that was being upgraded from.
  6. Generate a CreateDatabase.sql script (for shipping with the app) and a CreateDatabaseXXX.sql (for unit tests only) script. In this case XXX is the version that will be created. The 2 scripts are the same except for the name.
  7. If necessary (rarely) append further queries to the scripts. E.g. to set the values of new fields, or to add new data rows.
  8. Update unit tests to suit the new db
  9. Check in changes
When the application is run, it goes through the following process
  1. If no database is found, then run CreateDatabase.sql
  2. Get the current version number from the database
  3. If it is less than the expected version number then
    run UpgradeXXX.sql and go to 2
When the unit tests are run, the first step is to upgrade UnitDb to the current version using the same process as the main application. That means that other people's changes are automatically applied. The unit tests included:
  • Make sure the current version is correct
  • Make sure that every table and every field exists (this doesn't always need to be explicit as the persistence unit tests should pick up any problems here).
  • Create a new blank database (for a number of different versions) and work though the upgrade process to make sure the final database is correct.
The secret weapon in all this was SQL Compare, which makes generating the scripts quite straightforward.
Also, the upgrade scripts can do more than one set of version changes, i.e. Upgrade001.sql could upgrade the version to v10 so that scripts 002 - 009 don't need to be run.

Situation 2 (Delphi and DBISAM) will follow in a later post.

Links
Sql Compare

Sunday, March 2, 2008

VMWare tips


Wireless networking

By default VMware networks are set up as bridged. However this doesn't work with Vista and wireless networks.

The easy answer is to change the network setup to NAT. However if you are doing server development this may not be much use. In that case, you can try network bridging as described here.

Resizing the VM image
The easy way is to download the VM Converter from here. Amongst other things (creating an image from an existing computer), this can let you resize an image.

Performance
If you have sufficient system memory, change the VMware settings to fit all virtual machine memory into reserved host RAM (Edit menu -> Preferences, Memory tab). This will give a performance boost.

Also, preallocate the hard drive space.


Links
VMware Knowledgebase article - http://kb.vmware.com/selfservice/dynamickc.do?cmd=show&forward=nonthreadedKC&docType=kc&externalId=1212&sliceId=2&stateId=0%200%209345265
Bridging Process http://mingstert.blogspot.com/2007/12/vmware-wireless-network-adapter-and.html

VM Converter http://vmware.com/products/converter/

Tuesday, February 26, 2008

TIOPF - New persistence layer walkthrough

TIOPF

I use tiOPF for most new Delphi database applications.
tiOPF is an Object Persistence Framework. That is, it is a framework based around saving your objects to, and loading them from, databases and/or flat files.

See overview for more details on tiOPF.

Persistence Layers (PL)
TIOPF uses persistence layers to save/load objects. If you want to use an unsupported database, you just write another PL, and the required unit tests. The following is a walkthrough of the process of creating and testing a new PL.

Corelab - SDAC
I use Corelab's SQL Server Data Access Components for talking to MS SQL Server databases. This is faster than the BDE and ADO components I have used previously. However there is currently no PL for this...

Step 1 - Base point
Writing a PL involves the oldest form of code reuse - copy/paste.
Find an existing PL similar to what you want. In this case, I will use ADOSQLServer. The ado sql server layer is slightly more complicated than most as it uses an abstract class shared with ado access. However this shouldn't be a problem.

Because I am using a very similar persistence layer, the required changes are quite limited. In many cases, I would need to alter the connection string handling as well, and perhaps the multithreading support.

I need to copy the *ADOSQLServer files to *CrLabSDAC files, and then make the following substitutions:

  • TMSConnection replaces TADOConnection
  • TMSQuery replaces TADOQuery
  • CrSDAC replaces ADOSQLServer
  • CrAbs replaces ADOAbs
+ the various uses replacements

Using Delphi's 'Find in Files', I searched for ADOSQLServer in the ..\tiOPF2\Trunk\ directory and subdirectories. It pulls up the following files:

  • tiQueryADOSQLServer (which uses tiQueryADOAbs)
  • TTestTIPersistenceLayersADOSQLServer
  • tiTestDependencies
  • tiConstants
  • tiOPFManager

tiQueryADOSQLServer.pas is saved as tiQueryCrSdac.pas
tiQueryADOAbs.pas is saved as tiQueryCrAbs.pas
Both of these are saved in the \Options\ directory

tiOPFADOSQLServer_TST.pas is saved as tiOPFCrSdac_TST.pas
This is saved in the \UnitTests\Tests directory.

Step 2 Persistence Layer Changes
In tiQueryCrAbs, I made the following changes:

  • In the uses I replace ,ADODb with ,DBAccess, MSAccess
  • replace TADOConnection with TMSConnection
  • replace FADOConnection with FMSConnection
  • replace TADOQuery with TMSQuery
  • replace FADOQuery with FMSQuery
  • replace TtiQueryADO with TtiQueryCrSdac
  • replace TtiDatabaseADOAbs with TtiDatabaseCrAbs
  • replace cErrorADOCoInitialize with cErrorCrCoInitialize
  • replace EADOError with EDAError
  • replace cTIPersistADOSQLServer with cTIPersistCrSdac
  • delete cDelphi5ADOErrorString = '...';



In tiQueryCrSdac, I made the following changes:

  • In the uses replace ,ADODb with ,DBAccess, MSAccess and tiQueryADOAbs with tiQueryCrAbs
  • replace ADOSQLServer with CrSdac
  • replace TtiDatabaseADOAbs with TtiDatabaseCrAbs
  • replace TADOTable with TMSTable
  • Delete the line TtiQueryCrSdac = class(TtiQueryADO);


In tiConstants, I added the line
cTIPersistCrSdac = 'CrSdac';

In tiOPFManager, I added the line
{$IFDEF LINK_CRSDAC} ,tiQueryCrSdac {$ENDIF}
below
{$IFDEF LINK_BDEPARADOX} ,tiQueryBDEParadox {$ENDIF}

This means that the CR PL can be compiled into applications by using LINK_CRSDAC in the conditional defines instead of adding the unit tiQueryCrSdac to the application. This makes running the unit tests much easier, as my local copy of the standard tests just needs the define added.

Finally, I added the LINK_CRSDAC define to the DUnitTIOPFGui application and did a build.

Some errors turned up in constructor TtiQueryCrSdac.Create; so I commented them out for now. I found a few more compile errors as well due to the differences between the SDAC and ADO components:

  • replace ExecSQL with Execute
  • replace Parameters with Params
  • replace TParameter with TMSParam
  • replace CommitTrans with Commit
  • replace RollbackTrans with Rollback
  • replace BeginTrans with StartTransaction
  • replace ConnectionString with ConnectString

Finally it all compiles. I ran the unit tests to make sure I hadn't broken anything yet. I shouldn't have (yet) but ...

8 minutes later, 1729 tests are run and passed.

Step 3 Unit Test changes

In tiOPFCrSdac_TST, I made the following changes:

  • replace ADOSQLServer with CrSdac

In tiTestDependencies, I made the following changes:
  • added ,tiOPFCrSdac_TST after ,tiOPFADOSQLServer_TST
  • added tiOPFCrSdac_TST.RegisterTests; after tiOPFADOSQLServer_TST.RegisterTests;

Update: Add the following define to the unit test properties: LINK_CRSDAC

This now adds unit tests for the CrSdac PL. It will run the same tests as the AdoSqlServer layer. If necessary, I could override and alter the tests to accommodate database changes.

Compile and run again.

This time there are numerous errors. In part, this is because this layer doesn't implement CreateDatabase and no default database has been defined.

I ran the tests again. In the Setup dialog, I clicked on the [Local Settings] button. I added the following lines to the ini.
[DB_CrSdac]
DBName=localhost:tiopf
UserName=NULL
Password=

This resolves most errors, leaving only 5. I won't step through the process of fixing them. They came down to:
  • SDAC truncates long strings to 8000. This needs further investigation
  • SDAC GetTableNames returns the owner as part of the name (eg 'dbo.MyTable') which needs to be removed.
Step 4 Real data test
In the unit tests for my real application, I added the tiQueryCrSdac unit. I can now swap persistence layers by altering the connection string used in my application.

Running the tests raises a couple more errors which are resolved by changing some of the SDAC query options. Once that is done, and the tests all pass, I ran my application against some test data, and real data. All works well.

Step 5 Build a patch file



Links
TIOPF http://tiopf.sourceforge.net/
Overview http://tiopf.sourceforge.net/Doc/overview/index.shtml
Corelabs http://crlab.com/

Friday, February 1, 2008

Remote access to computers

I spent a fair amount of time using one computer to look at/control another across the internet. Over the past 5 years I have evaluated a number of products. The following are some of my favourites.

Remote Desktop
This is the best option across a WAN, giving good performance. However it is a pain in the proverbial to set up for use across the internet, as the ports are frequently blocked by firewalls.


LogMeIn.com
LogMeIn is the one I use most frequently, for controlling my personal computers. A free account will let you control a reasonable number of computers (5 I think). The paying version has additional features such as file access, sound and printing. I have used the free version for about 5 years.

Install is straightforward: go to the web site, log in, install the software and go. The software takes care of firewall and NAT issues in nearly every case. In 5 years, I only found one location where I couldn't get connectivity.

There is software available for nearly everything, I have even used my windows mobile cellphone to control my pc.

It's account based, so it is good for computers I own. It is not so good for other computers where I don't have physical access beforehand as I obviously don't want to pass my account details around. Once installed, the client pc can be unattended.

GotoMyPc is an alternative, but they don't have a free version.

CrossLoop
I use crossloop (no relation) for remote user support. They download and install crossloop and I do the same. They click on the share button, send me the access code, and then I connect using the same access code. Performance is not as good as LogMeIn, but it is usually adequate.

It does require a user on the client machine to run the software, hit the share button and provide the access code. It's free to use.

CoPilot is an alternative which is probably better for Grandma and technically challenged users. It's free on weekends, and $5 a day during the week.

Links

LogMeIn
GotoMyPC
CrossLoop
Copilot

Monday, January 21, 2008

Garbage collection: Performance test

Following my initial GC post, I received feedback regarding my comment "A well written and tuned garbage collector can be faster than manual allocation.". These comments can be summarised as "show us the proof".

I asked on the Boehm GC mailing list (if in doubt, ask for help). The conversation starts here.

They provided the following (my summary):
  • One benchmark is here, showing that speeds are comparable given sufficient memory (a gc will require more memory).
  • Another is here from Hans Boehm's presentation. See pages 50 onwards. He comments that it is a toy benchmark, on linux.
  • Malloc implementations have improved
  • Code that favours manual allocation
    Simple create, do something, free
    Large objects
  • Code that favours gc
    Complicated lifetime management
    try ... finally, free
    multi threading

Well that kinda helps. But what about in Delphi?

I have done some quick tests using my modified version of the Delphi wrapper for the Boehm GC (Delphi GC for short). The modifications shouldn't make any major difference to the result.

Delphi benchmark 1:
This is a simple, trivial, benchmark. It creates 60,000,000 small objects and assigns a value.

The object is simply:

TTestObject = class
public
  Lines: TStrings;
  constructor Create();
  destructor Destroy; override;
end;
and the test is simply
for f := 1 to TestCount do
begin
  testObj := TTestObject.Create;
{$ifdef USE_GC}
  testObj.Lines.Add('aaa');
{$else}
  try
    testObj.Lines.Add('aaa');
  finally
    testObj.Free;
  end;
{$endif}
end;
The try ... finally free section is not required by the GC version as we don't have to worry about memory leaks.
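
The constructor and destructor bodies are not shown above; presumably they do little more than create and free the string list, along these lines:

constructor TTestObject.Create();
begin
  inherited Create;
  Lines := TStringList.Create;
end;

destructor TTestObject.Destroy;
begin
  Lines.Free;
  inherited Destroy;
end;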

The GC tests were repeated with a range of initial heap sizes and on different computers. The FastMM test was also tried without the try ... finally. The source code is available if anyone wants it.

The results are


Times are in seconds:

                       Old laptop, 512mb   Core 2, 2gig   Single core, 2gig   Quad core, 3gig
FMM (no try finally)                       approx 31.5
FMM try finally        81.281              33.306         37.875              48.046
GC 0mb                                     73.181         59.047              46.25
GC 5mb                                     39.499         32.906              29.656
GC 10mb                60.891              30.857         29.422              27.984
GC 20mb                58.328              26.926         27.437              27.062



Given a large enough initial heap, the gc version ends up faster than the FastMM version.

This is not a serious benchmark, but it does indicate that a gc can be faster than manual allocation.

Delphi benchmark 2

For this, I added the gc to 2 of my existing unit tests. It was a 2 line conversion, I just added the gc and set the initial heap size.

Enable is a work injury management system. It is heavy on database access and single threaded.
Envisage is a document management system. Database access is done via the tiopf object persistence framework. It reads pdf files, checks for bar codes and creates new ones. It is multi-threaded. It uses a large amount of memory.

Here are the results:


          Envisage, no threads   Envisage, threaded   Enable
FMM       70                     114.4                16.4
GC 20     74                     119.0
GC 40     71                     117.5                14.1
GC 100    71                     115.6




Conclusion
Based on these results, I would have to say that my comment "A well written and tuned garbage collector can be faster than manual allocation." is correct. Given sufficient heap space, the gc version is faster in some tests, and 1% slower in others.

Let me restate my conclusion, as the initial one is not well worded in terms of what I intended to say. A better conclusion would be "It is possible for a garbage collected application to run at a speed similar to that of an application using manual deallocation". Or alternately, "adding a gc to an application doesn't automatically make it incredibly slow".

The gc performance could probably be improved further by surfacing the gc tuning options, improving the Delphi wrapper and using a later version of the GC. The unit tests could also be sped up by removing the now redundant frees, destructors and try ... finally blocks.

The Boehm GC used is an early version 6 (6.2 or so). Version 7 is available from CVS. V7.1 should be released soon.

There are downsides to using a gc, such as increased memory use. It is not appropriate for all applications, especially those with memory constraints. However speed does not appear to be one of those downsides.

Update

In response to a query, yes the garbage collector is running, and collecting the objects. After the initial run (which may increase the heap), the heap size remains static no matter how many times I repeat the test (any of the tests).

I also repeated the FastMM test removing the testObj.Free; line.

It "completed" in 35 seconds. By completed, I mean "used up 1.3gig of free mem, all my 4gig page file and then threw an "out of memory" exception.


Reference
GC Mailing list: Are there any benchmarks comparing the speed of gc v non gc
Garbage collector for Delphi
Boehm GC
Wikipedia article


Friday, January 18, 2008

Garbage collection: Follow up

Given some of the feedback on my previous post, I thought a follow up would be in order.

Performance
One of the most contentious points was my comment that "A well written and tuned garbage collector can be faster than manual allocation". I will cover this in a separate post as it needs more than a couple of lines.

Why would you want to use a GC in delphi
I will cover this in a separate post as well. There is probably little gain in just adding a gc to an existing delphi app (unless it's leaky, but we don't write apps like that). If you are writing a new app based around having a gc in place, then you can do things differently.

Clarifications
One of the original quotes referred to objects referencing each other not being released. I read this as talking about cyclic references. That is a problem with simple reference counting, but not with a tracing (i.e. mark and sweep) gc such as that used by Boehm, .NET and Java (1).

A gc is not a silver bullet, nor will it catch all memory leaks. I am not suggesting that it will.

Corrections
One point I forgot to mention. Some gc algorithms will allocate extra memory for flags, counts etc (1). This can push up the memory use compared to manual allocation. However Fast mm 4 (2) also allocates a 32 bit flag ahead of every memory block, so it is probably a wash.

Fast mm
Fast mm will not catch all memory leaks. It will catch memory that hasn't been freed when the application exits (2) which is not the same thing.

If you have poor testing coverage, then the untested code can have memory leaks.
Fast mm will not catch objects freed on application shutdown (ie forms owned by the application). A gc won't catch this either.

Fast mm will help with double frees, but it won't help with a/v errors (unless I am missing something, it certainly hasn't helped me). A gc will help with both of those (1).


References
1 The wikipedia article on Garbage collection provides a lot of the background.
2 Fast mm details are taken from http://dn.codegear.com/article/33416

"However most Delphi memory managers request large chucks of memory from windows and then parcel it out to the app on request," See (2) and Nexus MM

The quotes in the original article are taken from the newsgroup thread "Garbage collection"