Database application development pathways with RAD Studio XE2
Aug 30th
Embarcadero’s RAD Studio XE2 allows the following database development pathways:
Delphi (Object Pascal) or C++ generated DataSnap middle-tier data servers, which run on a Windows server and allow the following thin clients to connect:
- Delphi Object Pascal or C++ VCL forms clients compiled for either Win32 or Win64
- Delphi Object Pascal FireMonkey clients compiled as native applications for any of the following platforms:
- Win64
- Win32
- Mac OS X
- iOS
- and in 2012, Linux and Java for Android
- via Mobile Connectors:
- Eclipse-compiled Java client for Android
- Xcode-compiled iOS client for iPad and iPhone
- Blackberry client
- Windows Phone 7 client
- ISAPI DLL web client hosted on a Windows web server
- RadPHP web application hosted on a web server (requires the DataSnap server to be a REST application)
- RadPHP mobile application packaged for mobile devices via PhoneGap
- .NET client created using Delphi Prism (RemObjects' Oxygene), Object Pascal for .NET running in MS Visual Studio
- ASP.NET web database application created using Delphi Prism (RemObjects' Oxygene), Object Pascal for .NET running in MS Visual Studio.
- Java client created using RemObjects' Cooper project, Object Pascal for Java running in MS Visual Studio (in development)
Separately, the third-party vendor RemObjects is extending its Hydra technology to allow FireMonkey plugins to work in VCL forms applications and vice versa, as well as WPF plugins to work in FireMonkey apps and vice versa; see here.
RadPHP web database applications using a MySQL database.
An approach to client server databases using Delphi
Dec 17th
The following is an approach I have used successfully in robust, 24×7, multi-user client-server database applications using Delphi 6 and upwards, with either Sybase or MS SQL Server databases.
This approach uses the traditional Delphi MIDAS TClientDataset-based design, with an emphasis on persistent fields, which allows maximum leverage of the capabilities of TClientDatasets. The main problem with the persistent field approach is portability of code between different database interfaces (eg. BDE vs ADO vs DBX4) and different database servers with their differing data types.
This is in CONTRAST to, and is not directly compatible with or easily modified to, the approach taken by RemObjects Data Abstract, which does NOT use persistent fields but instead uses strongly typed data tables; see here for the RemObjects approach.
To ensure greater portability between databases, I have also avoided the use of stored procedures and similar server-side logic in the database.
You may also wish to read my earlier post about my experiences in migrating a Delphi 6 BDE app to DBX4 in Delphi 2007 – see here.
A database connection datamodule
On this datamodule, place one or more TDatabase components if using the old but still handy Borland Database Engine (BDE) technology, or a TSQLConnection component if using the newer dbExpress (DBX) technology.
This generally requires minimal coding: just create a BDE alias in SQL Explorer and set the BDE alias name if using TDatabase, or create a connection string and set the TSQLConnection to use it if using DBX.
Optionally, you can use the datamodule's OnCreate event to write code that automatically copies any needed files (eg. midas.dll, plus the DBX driver(s), dbxdrivers.ini and dbxconnections.ini if using DBX) to the program folder from a central repository on a server, to minimise installation work. Be sure the user has sufficient security rights in Windows to write to that folder!
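For example, a minimal sketch of such an OnCreate handler, assuming a hypothetical \\myserver\runtime share, a datamodule named dmDatabase, and that Windows and SysUtils are in the uses clause:

procedure TdmDatabase.DataModuleCreate(Sender: TObject);
var
  appDir: string;
begin
  appDir := ExtractFilePath(ParamStr(0));
  // the final True parameter stops CopyFile overwriting an
  // existing local copy
  CopyFile(PChar('\\myserver\runtime\midas.dll'),
    PChar(appDir + 'midas.dll'), True);
  CopyFile(PChar('\\myserver\runtime\dbxconnections.ini'),
    PChar(appDir + 'dbxconnections.ini'), True);
end;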
If using the BDE on a network, my preferred approach is to place the BDE files in an accessible server folder and create a BDE registry file which points to this folder (eg. export your own BDE registry records using regedit, then use Notepad to replace all the folder locations with your server folder). You can then just install this .reg file on each workstation (as long as you have administrator rights to do so).
Note that although I have also used dbGo components, which access ADO technology, I have found this to be slower than either of the above for complex database applications when identical SQL and programming are used. However, some may prefer dbGo, and it will work almost as well, apart from some data type differences compared with the BDE and a bug in returning datetime values which leaves out the time component; there is a workaround for this issue too.
A main SQL query datamodule
This holds all your SQL statements, which access the database via TQuery (if using the BDE) or TSQLQuery (if using DBX) components linked to the database (make sure you add the database datamodule to this datamodule's uses clause).
The TQuery component is linked via its DatabaseName property. I also set the UpdateMode property to upWhereKeyOnly (but if using reconciliation handling, you may wish to use upWhereAll; in any case, it should match the UpdateMode set on the TDatasetProvider, see below).
The TSQLQuery component is linked to the database via the SQLConnection property.
Once you have written your SQL statement and tested it to make sure it works, then if it uses parameters, make sure you go to the Params property and set each parameter's DataType; I usually set ParamType to ptInput.
Then double-click on the component to bring up the field list, right-click, and select Add All Fields. We generally NEED to use PERSISTENT FIELDS to ensure we can write data back to the database, and to use nice features such as calculated fields and lookup fields.
Set the ProviderFlags for each field to pfInUpdate, and for the key field also add pfInKey. If you are going to use reconciliation dialogs and handling (which I don't use, as it tends to confuse most end users), then you will also need to add pfInWhere.
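The same flags can also be set in code if you prefer; a small sketch with an illustrative query named qryPatient and key field PatientID (i: Integer):

// every field participates in UPDATE statements...
for i := 0 to qryPatient.FieldCount - 1 do
  qryPatient.Fields[i].ProviderFlags := [pfInUpdate];
// ...and the key field also identifies the row in the WHERE clause
qryPatient.FieldByName('PatientID').ProviderFlags := [pfInUpdate, pfInKey];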
In general, you will not need to write ANY code in this datamodule.
A lookup table datamodule
In complex applications, I prefer to place all my lookup tables in their own datamodule, for ease of use.
Detail is as for the main datamodule.
TClientDataset datamodules for each SQL datamodule
I like to have a main clientdataset module and one for my lookup clientdatasets.
If you do this, you will need to add the lookup datamodule to the main datamodule's Uses clause, and create it BEFORE the main datamodule.
You will also need to place the relevant SQL datamodule in the Uses clause.
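The creation order is set in the project source (Project > View Source); a sketch with illustrative datamodule names:

Application.CreateForm(TdmDatabase, dmDatabase);   // connection first
Application.CreateForm(TdmMainSQL, dmMainSQL);     // SQL datamodules next
Application.CreateForm(TdmLookupSQL, dmLookupSQL);
Application.CreateForm(TdmLookupCDS, dmLookupCDS); // lookup clientdatasets BEFORE
Application.CreateForm(TdmMainCDS, dmMainCDS);     // the main clientdataset module
Application.CreateForm(TfrmMain, frmMain);
Application.Run;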
First, I place a TDatasetProvider on the datamodule, set its Dataset property to the TQuery or TSQLQuery component, and set the UpdateMode to upWhereKeyOnly (or upWhereAll if using reconciliation handling).
If you will be writing dynamic SQL statements in your application to pass through this component, you will need to set poAllowCommandText in the provider's Options.
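With poAllowCommandText set, a sketch of sending dynamic SQL from the client side looks like this (component, table and param names are illustrative):

cdsSearch.Close;
// the CommandText is passed through the provider to the underlying
// TQuery/TSQLQuery because poAllowCommandText is set
cdsSearch.CommandText := 'SELECT * FROM Patients WHERE Surname LIKE :Surname';
cdsSearch.Params.ParamByName('Surname').AsString := 'SM%';
cdsSearch.Open;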
Then I place a TClientDataset component and link it to the TDatasetProvider via the ProviderName property (under Linkage tab in D2007).
If params were used, then right-click the component and click on Fetch Params to set these automatically.
Then double click the component and create PERSISTENT fields as was done above and set the provider flags as above.
Now you can also create NEW persistent fields such as InternalCalc fields (these can be used for sorting the table, by including them in the IndexFieldNames property, or for filtering it), calculated fields, and lookup fields (the latter two cannot be used in indexes or filters, nor can they be referenced by an InternalCalc field). Obviously, changes in these new fields will NOT be sent back to the database.
If using lookup fields, I add some code in the BEFOREOPEN event of the TClientDataset to ensure the lookup table being referenced is opened first, for example:
with (Dataset as TClientDataset) do
if fieldCount > 0 then
for i := 0 to fieldCount-1 do
if fields[i].LookupDataset <> nil then
with (fields[i].LookupDataset as TDataset) do
if not active then
open;
If using InternalCalc fields or calculated fields, you will need to write code in the OnCalcFields event of the ClientDataset to set values for these fields. Remember that this event gets called very frequently, so avoid long-running code in it.
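For example, a minimal OnCalcFields handler, assuming illustrative persistent fields Surname and FirstName and a calculated string field FullName:

procedure TdmMainCDS.cdsPatientCalcFields(DataSet: TDataSet);
begin
  // keep this fast: it fires on every scroll and edit
  DataSet.FieldByName('FullName').AsString :=
    DataSet.FieldByName('Surname').AsString + ', ' +
    DataSet.FieldByName('FirstName').AsString;
end;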
Dealing with primary key fields without using auto-incrementing fields
In order to allow use across multiple database types, I prefer to avoid using auto-incrementing fields and instead manage these manually.
This needs to be done on two levels:
- within the TClientDataset, so that additional records created prior to applying updates and refreshing the clientdataset do not result in a local key violation. Here I use local variables, set to a unique negative value, within the datamodule.
- on the database itself. There are a variety of techniques one can use, but my preference is to write code in the TDatasetProvider.BeforeUpdateRecord event to retrieve the next available key value and then set the key field's value to this (see below for some code).
Preventing TClientDataset key violations:
Create a private variable in the datamodule such as Table1ID_Temp: Integer (or one could use a TClientDataset to manage these if there are a lot of tables);
In the ClientDataset.AfterOpen event set the variable to zero;
In the ClientDataset.OnNewRecord event decrement the variable (so it becomes negative and unique) and set the table’s key field value to this variable;
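Put together, a sketch of the two handlers (Table1ID_Temp is the private variable described above, and component and field names are illustrative):

procedure TdmMainCDS.cdsTable1AfterOpen(DataSet: TDataSet);
begin
  Table1ID_Temp := 0; // reset whenever fresh data is fetched
end;

procedure TdmMainCDS.cdsTable1NewRecord(DataSet: TDataSet);
begin
  Dec(Table1ID_Temp); // becomes -1, -2, ... unique within this session
  DataSet.FieldByName('Table1ID').AsInteger := Table1ID_Temp;
end;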
Manually updating a key field:
TDatasetProvider.BeforeUpdateRecord event:
case UpdateKind of
  ukInsert:
    begin
      // IncrementPrimaryKey runs a function which sends SQL to the
      // database to get max(PrimaryKeyValue) and then increments it
      // (a sketch of it follows below)
      newKey := IncrementPrimaryKey(TableName);
      DeltaDS.FieldByName(IndexFieldName).NewValue := newKey;
    end;
  ukDelete:
    // a negative key is a temporary clientdataset value that has not
    // yet been sent to the server, so there is nothing there to delete
    if DeltaDS.FieldByName(IndexFieldName).OldValue < 0 then
      Applied := True; // so skip the delete
end;
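A rough sketch of what IncrementPrimaryKey might look like, assuming a spare TQuery named qryNextKey and a key-naming convention of TableName + 'ID' (both illustrative); note that under heavy concurrent inserts a dedicated key table updated inside a transaction would be safer than MAX():

function TdmMainSQL.IncrementPrimaryKey(const ATableName: string): Integer;
begin
  // assumes the key column is named <TableName>ID
  qryNextKey.SQL.Text :=
    'SELECT MAX(' + ATableName + 'ID) AS MaxKey FROM ' + ATableName;
  qryNextKey.Open;
  try
    Result := qryNextKey.FieldByName('MaxKey').AsInteger + 1;
  finally
    qryNextKey.Close;
  end;
end;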
Minimising server-side cursors and locking
In the TDatasetProvider.OnGetData event, write the following code which will close the server cursor once the data is retrieved:
with (Sender as TDatasetProvider).DataSet do
  if Active then
    Close;
What we do NOT want to do is allow the BDE to maintain table cursors, as this will cause problems with many concurrent users.
Try to keep your SQL statements short-running and accessing as few tables as possible, to reduce the chance of deadlocks occurring. Don't forget you can use lookup fields in your TClientDatasets, so you may not need to join to the lookup tables in your SQL statements.
Whatever you do, DON’T use TTable components or similar for a server database – they are fine for Paradox, dBase and MS Access databases but NOT for MS SQL Server, Oracle, Sybase, etc.
Avoiding the MS SQL Server issue of only one transaction at a time:
The MS SQL Server OLE DB provider can only have one transaction open per connection, and there is a risk that ApplyUpdates will fail with the error: “Cannot create new transaction because capacity was exceeded”. The workaround when using DBX is the following:
In the TDatasetProvider.BeforeApplyUpdates event:
((Sender as TDatasetProvider).DataSet as TSQLQuery).SQLConnection.CloseDataSets;
Saving edits to the database:
One of the wonderful features of TClientDatasets is that user edits are initially performed on the local data, and your design can either allow these edits to be saved to local disk files for delayed updating back to the server, or even allow a user to undo an edit or cancel all edits before they are sent to the server.
A simple method to send the edits back to the server is to place the following code in the TClientDataset.AfterPost event:
with DataSet as TClientDataset do
  if ChangeCount > 0 then
    ApplyUpdates(-1);
Note that the -1 is the MaxErrors argument: updates are applied regardless of the number of errors, i.e. you are not going to handle any errors.
The same code could be used in the AfterDelete and AfterInsert events as well.
You also need to ensure any pending edit is posted prior to shutting the application down, so in the TClientDataset.BeforeClose event just add:
if DataSet.State in [dsEdit, dsInsert] then
  DataSet.Post;
And in the datamodule.OnDestroy event you could add:
for i := 0 to ComponentCount - 1 do
  if Components[i] is TClientDataset then
    if (Components[i] as TClientDataset).Active then
      (Components[i] as TClientDataset).Close;
Creating audit trails and historical data trails:
Place code in the TDatasetProvider.BeforeUpdateRecord event to look at the DeltaDS to see which data items are being changed before they are sent to the server, using code such as the following (NB. audit1 is a string variable, and delim is a Char variable acting as a field delimiter within audit1):
for i := 0 to DeltaDS.FieldCount - 1 do
  if DeltaDS.Fields[i].FieldKind = fkData then
    if not VarIsEmpty(DeltaDS.Fields[i].NewValue) then // if changed
      if DeltaDS.Fields[i].DataType = ftTimeStamp then
        audit1 := audit1 + DeltaDS.Fields[i].FieldName + delim +
          SQLTimeStampToStr('dd/mm/yyyy hh:nn',
            VarToSQLTimeStamp(DeltaDS.Fields[i].NewValue)) + delim
      else
        audit1 := audit1 + DeltaDS.Fields[i].FieldName + delim +
          VarToStr(DeltaDS.Fields[i].NewValue) + delim;
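Once audit1 has been assembled, it can be written to an audit table within the same event; a rough sketch, assuming a hypothetical AuditLog table and a spare TSQLQuery named qryAudit:

qryAudit.SQL.Text :=
  'INSERT INTO AuditLog (TableName, ChangedAt, Details) ' +
  'VALUES (:TableName, :ChangedAt, :Details)';
qryAudit.ParamByName('TableName').AsString := TableName;
qryAudit.ParamByName('ChangedAt').AsDateTime := Now;
qryAudit.ParamByName('Details').AsString := audit1;
qryAudit.ExecSQL;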
The above covers most of the database management logic, leaving you to deal with the business logic and the user interface:
- data validation, eg. using the persistent field's OnValidate event
- default values, eg. using the persistent field's OnChange event to set values for other fields
- master-detail tables: I prefer not to embed these in a TClientDataset but to manually open and close the detail, perhaps triggered by a TDatasource event which sets the param of the detail clientdataset and then opens it (see the sketch after this list)
- user interface: put your datamodule in the Uses clause of your form, then place a TDatasource component on the form and link its Dataset property to a TClientDataset. You then have a multitude of data-aware components which will link to the TDatasource component.
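For the master-detail approach mentioned above, a sketch of a TDatasource.OnDataChange handler that re-opens the detail whenever the master row changes (component, param and field names are illustrative):

procedure TdmMainCDS.dsMasterDataChange(Sender: TObject; Field: TField);
begin
  // Field = nil signals a record-level change (e.g. a scroll)
  if Field = nil then
  begin
    cdsDetail.Close;
    cdsDetail.Params.ParamByName('MasterID').AsInteger :=
      cdsMaster.FieldByName('MasterID').AsInteger;
    cdsDetail.Open;
  end;
end;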