Speed Comparison Tests

Code:

Sub UpdateTestString()

    Dim sngStart As Single, sngEnd As Single

    DoCmd.Hourglass True

    'first reset any existing data from previous tests
    CurrentDb.Execute "UPDATE Postcodes SET Postcodes.Accuracy = Null WHERE Postcodes.Accuracy ='OK';"

    sngStart = GetCurrentSystemTime

    'run test using one of the following:
    '1. Trim
    CurrentDb.Execute "UPDATE Postcodes SET Postcodes.Accuracy = 'OK' WHERE (((Trim([Accuracy] & ''))=''));"

    '2. Len
    'CurrentDb.Execute "UPDATE Postcodes SET Postcodes.Accuracy = 'OK' WHERE (((Len([Accuracy] & ''))=0));"

    '3. Nz
    'CurrentDb.Execute "UPDATE Postcodes SET Postcodes.Accuracy = 'OK' WHERE ((Nz(Accuracy,'') =''));"

    sngEnd = GetCurrentSystemTime

    DoCmd.Hourglass False

    MsgBox "Time taken = " & Round((sngEnd - sngStart), 2) & " seconds"

End Sub

This was originally written in response to a question by Mister Chips at the Utter Access forum:

Best Way To Check For 'nothing' In A Form Control

 

The OP wanted to know the best approach for checking whether there is anything at all in a control, be it a text box, combo box etc.

 

Three methods were suggested by the OP, Phil C & myself respectively:

a)  Trim e.g. If Trim(txtControl & "") = vbNullString Then

b)  Len e.g. If Len(txtControl.Value & vbNullString) <> 0 Then

c)  Nz e.g. If Nz(txtControl, "") <> "" Then
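
For context, here is a minimal sketch showing the three checks side by side (the wrapper function IsControlEmpty is illustrative and not from the original thread):

'Sketch only - returns True if the control contains no usable value
Private Function IsControlEmpty(ctl As Control) As Boolean

    'a) Trim : appending "" converts Null to a zero-length string first
    IsControlEmpty = (Trim(ctl.Value & "") = vbNullString)

    'b) Len : the equivalent test using the string length
    'IsControlEmpty = (Len(ctl.Value & vbNullString) = 0)

    'c) Nz : replaces Null with "" before comparing
    'IsControlEmpty = (Nz(ctl.Value, "") = "")

End Function

'e.g.  If IsControlEmpty(Me.txtControl) Then MsgBox "Nothing entered"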

 

It was strongly suggested by several forum members that Len would be faster as

'VBA handles numbers better than strings, especially in comparing, so checking if Len(...) <> 0 is easier for VBA than comparing two strings'
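
For what it's worth, the VBA part of that claim can be sanity-checked in isolation with a tight loop like the sketch below (illustrative only; the iteration count and names are arbitrary and results will vary by machine):

'Sketch only - compares Len(...) <> 0 with a direct string comparison
Sub CompareLenVsStringTest()

    Dim strValue As String
    Dim lngCount As Long, lngHits As Long
    Dim sngStart As Single

    strValue = vbNullString

    sngStart = Timer
    For lngCount = 1 To 10000000
        If Len(strValue) <> 0 Then lngHits = lngHits + 1
    Next lngCount
    Debug.Print "Len test:            " & Round(Timer - sngStart, 2) & " seconds"

    sngStart = Timer
    For lngCount = 1 To 10000000
        If strValue <> "" Then lngHits = lngHits + 1
    Next lngCount
    Debug.Print "String compare test: " & Round(Timer - sngStart, 2) & " seconds"

End Sub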

 

I was sceptical, so decided to test all 3 methods on a local table with approx 2.6 million records.

In each case, a text field was modified where it was Null or a zero-length string. These made up approximately 20% of the total records.

 

I made sure nothing else was running during each test, to try & ensure there were no other factors influencing the results. I ran the tests repeatedly for each of the methods but found little variation between each set of results.

 

Tests were repeated on several other workstations. In each case, the order was the same: Nz was marginally faster than Len, with Trim being the slowest.

 

However, the differences were relatively small.

To my mind, this perhaps indicates that, given the processing power of most modern computers, it is easy to get too bound up in worries about optimisation.

 

I also ran the tests with the Accuracy field first unindexed & then again after indexing it. The outcome was consistently slower for the indexed field: indexing speeds up searches but significantly slows down updates.
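
For anyone repeating this, the index on the Accuracy field can be added and removed between runs with DDL statements along these lines (a sketch only; the index name idxAccuracy is illustrative):

'Sketch only - add or remove the index used for the indexed test runs
Sub ToggleAccuracyIndex(blnAddIndex As Boolean)

    If blnAddIndex Then
        CurrentDb.Execute "CREATE INDEX idxAccuracy ON Postcodes (Accuracy);", dbFailOnError
    Else
        CurrentDb.Execute "DROP INDEX idxAccuracy ON Postcodes;", dbFailOnError
    End If

End Sub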

 

Each result is the average of 3 tests, though there was minimal variation between individual runs.

 

Times were measured using the system timer (updated approximately 60 times per second) and rounded to 2 decimal places, i.e. centiseconds.

 

I had intended to compare all 3 approaches by looping through a recordset, which I knew would be much slower. The first test was indeed VERY SLOW and eventually crashed the database as the file size approached the 2 GB limit. I abandoned the remaining recordset tests!
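
For reference, the recordset version of the Trim test would have looked something like the sketch below (illustrative only; this is the approach that proved far slower and bloated the file):

'Sketch only - recordset equivalent of the Trim update query
Sub UpdateTestRecordset()

    Dim rst As DAO.Recordset

    Set rst = CurrentDb.OpenRecordset("Postcodes", dbOpenDynaset)

    Do Until rst.EOF
        If Trim(rst!Accuracy & "") = "" Then
            rst.Edit
            rst!Accuracy = "OK"
            rst.Update
        End If
        rst.MoveNext
    Loop

    rst.Close
    Set rst = Nothing

End Sub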

 

Click to download:  Empty Fields Comparison Tests  (approx 25 MB, zipped)

 

NOTE: This is a VERY LARGE download due to the table containing 2.6 million records.

 


As previously mentioned, system time is updated about 60 times per second, i.e. at roughly 0.016 second intervals. So there is some potential error in the measured values, but not as much as the time differences between the methods.

 

However, to me, the results indicate that in real-life situations there is minimal difference between the methods.

 

There are other ways of measuring time precisely.

For example, you can use GetTickCount, but that is also based on the system time.

The built-in Timer function can be used to give times to centisecond accuracy, though with the same limitation.

You could also use the StopClock class code, but this also appears to be limited to millisecond accuracy.
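
For example, a GetTickCount version of the timing wrapper might look like this (a sketch only, not the code used for the tests; GetTickCount returns milliseconds since the system started, with roughly the same timer resolution):

'Sketch only - timing with the GetTickCount API instead of GetSystemTime
#If VBA7 Then
    Declare PtrSafe Function GetTickCount Lib "kernel32" () As Long
#Else
    Declare Function GetTickCount Lib "kernel32" () As Long
#End If

Sub TimeWithTickCount()

    Dim lngStart As Long, lngEnd As Long

    lngStart = GetTickCount
    'run the update (or other code) being timed here
    lngEnd = GetTickCount

    MsgBox "Time taken = " & Round((lngEnd - lngStart) / 1000, 2) & " seconds"

End Sub

The system time code actually used for the tests above is: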

 

Option Compare Database
Option Explicit

'structure filled by the GetSystemTime API call
Private Type SYSTEMTIME
    wYear As Integer
    wMonth As Integer
    wDayOfWeek As Integer
    wDay As Integer
    wHour As Integer
    wMinute As Integer
    wSecond As Integer
    wMilliseconds As Integer
End Type

#If VBA7 Then
    Declare PtrSafe Sub GetSystemTime Lib "kernel32" (lpSystemTime As SYSTEMTIME)
#Else
    Declare Sub GetSystemTime Lib "kernel32" (lpSystemTime As SYSTEMTIME)
#End If

Function GetCurrentSystemTime() As Single

'Use this when accurate time differences to milliseconds are required

    Dim tSystem As SYSTEMTIME

    GetSystemTime tSystem
    'combine whole seconds from Timer with the milliseconds from the API,
    'padding the milliseconds to 3 digits so that e.g. 7 ms becomes .007 not .7
    GetCurrentSystemTime = CDbl(Int(Timer) & "." & Format(tSystem.wMilliseconds, "000"))
    'Debug.Print GetCurrentSystemTime

End Function

[Screenshots: SpeedTest1 results, indexed and not indexed versions]


This article describes various tests done to compare different approaches to coding:

1. Handling nulls: Trim / Len / Nz                        

2. CurrentDB vs DBEngine(0)(0)                          

3. DoEvents vs DBIdle.RefreshCache                  

4. HAVING vs WHERE                                          

5. Conditional Updates  

6. Query vs SQL vs QueryDef                                      

 

Example databases are provided so the same tests can be run on your own workstations.

 

Also see these related articles:  

a) Show Plan - Run Faster
   This article explains how the little-documented Jet ShowPlan feature can be used to assist with optimising queries & VBA SQL statements.

 

b) Synchronise Data
   This article discusses various ways of synchronising data with external tables. The times are compared, as well as the increase in file sizes associated with each method.


1. Handling nulls: Trim / Len / Nz               Updated 15/01/2019

Difficulty level:  Moderate