a collection of technical fixes and other random stuff

Spodworld


How to get past MS Access error 'database or project contains a missing or broken reference to the file 'comdlg32.ocx' version 1.2'

I came across this error message while trying to open a database app developed in MS Access.
Unfortunately the UI would freeze after clicking OK and I couldn't continue.

To get around this I opened the Access file (.mdb file) in "design mode".

To do this, hold down 'Shift' while opening the database.

This should hopefully skip any startup code that has been added that stops the database from opening.

 

Hope that helps you!

VMrun reports could not connect to the virtual machine or permission denied

Problem:

Recently I was creating a C# Windows service which interacted with VMware machines.

When trying to start up a VM by calling vmrun.exe I got the error message

"Unable to connect to host"

I found this by piping the response to a text file.

This is how it was solved:

To get around this I changed the command I was using to run the VM without the GUI, via the 'nogui' option, as shown below.
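For illustration, a start command using nogui looks something like this (the host type flag and .vmx path here are examples I've made up, not from my original service):

vmrun -T ws start "C:\VMs\TestVM\TestVM.vmx" nogui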

The only thing you need to consider is that running VMs will not be controllable from your manual login/UI, as they are running via a different process.

 

Full command line details are available here:

http://www.vmware.com/pdf/vix162_vmrun_command.pdf

SQL select does a high number of reads for just one record

If you find a select statement is doing a lot of reads when you only have a few records or one record in the table, this may be down to image data types.

I've found in SQL Server I was getting 4000+ reads on a select statement to retrieve one or two records.

This is probably down to the image datatype, which needs several reads across pages to retrieve it.

I sped this up by changing one of the selects to not return the image column, as it was not needed.
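As a minimal sketch (the table and column names are invented), the fix is simply to list the columns you need instead of pulling everything back:

-- Before: drags the image column back, causing many extra page reads
SELECT * FROM Documents WHERE DocumentId = 1

-- After: returns only the columns actually needed
SELECT DocumentId, Title, CreatedDate
FROM Documents
WHERE DocumentId = 1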


Speeding up SQL server for millions of records

Recently I've been required to insert 6 million database records into the latest version of one of our products.

Actually, it's a lot more than this, because it is 6 million business objects, which of course tend to have their related lookup tables, so the real figure is much higher.

The problem I have is that for reliable data, the information needs to be inserted through the interface to guarantee it is like a production environment, and this is slow... especially when you get over 1 million database records in your table.

 

As an experienced database developer, my first thoughts were to look at Profiler and get some information on improving the performance with code tweaks or better indexing. Being an almost mature product, there was little to do here, but a few changes to the indexes and stored procedures helped a bit. This gave me about a 50% speed improvement, but it didn't last for long.

Soon the process of inserting records, and the background logic doing lookups to check for duplicates etc., was slowing down to three or four records per second, which was going to take forever. At 2 million records, it was clear that this needed a different approach.

 

To rub salt in the wound, there were other factors stopping me from doing my task.

  • I had a brilliant load-testing app I had written that could reduce call time/latency to the website by doing all this multithreaded. Unfortunately the code wasn't very threadsafe, and performance got worse with multiple threads due to all the locking.
  • My super fast servers from last year had been hijacked by IT to run SharePoint.
  • I could just run several databases at low volumes and piece them all together. This involved a big manual manipulation of the data (a risk) and also... I didn't have time. At table sizes like these, even SQL Server's interface was struggling.

 

The solution was based around previous tests where we found a super fast SAN storage system sped up performance significantly. However, I didn't have access to one at such short notice.

I decided to improvise and create a RAM disk to store the SQL database on and hopefully speed things up. This in theory should be much faster at seeking, reading and writing than a standard mechanical hard drive.

I used a PC with 12 GB of RAM and created a RAM drive.

I downloaded a trial version of this software: http://www.superspeed.com/servers/ramdisk.php and created an 8 GB drive within five minutes.

I then 'detached' my database using the SQL Server interface and moved the .mdf and .ldf files to the RAM drive.

I then re-attached the database, making sure SQL Server had permission to the new files and their storage location.
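If you prefer to script the move, a minimal T-SQL sketch looks like this (assuming a database called MyDb and a RAM disk mounted as drive R:; copy the files between the detach and the attach):

EXEC sp_detach_db @dbname = N'MyDb';

-- Copy MyDb.mdf and MyDb_log.ldf to R:\ at this point

CREATE DATABASE MyDb
ON (FILENAME = N'R:\MyDb.mdf'),
   (FILENAME = N'R:\MyDb_log.ldf')
FOR ATTACH;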

After a database shrink and index rebuild on all tables, I restarted my task.

Insertion continued at 50-100 inserts per second, which was much faster than before, and the slowdown has stopped, or at least the rate of degradation is far lower than it was.

 

You can also move the system tempdb file to the RAM disk too, which can speed things up further. However, bear in mind that if you need to do any large data deletion or modification on large numbers of rows, you may find tempdb grows and fills your RAM disk whilst trying to do this. For this reason, I left this out on my final run.
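If you do want to try it, tempdb can be relocated with something like the following (the logical file names are usually tempdev and templog, and the move only takes effect after SQL Server restarts):

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = N'R:\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = N'R:\templog.ldf');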

 

Don't forget a large database will probably get several maintenance tasks carried out on it over a growth like this, so it may be handy to set an hourly index rebuild task whilst your insertion run is executing.
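A single-table rebuild is one statement (table name invented), which you can wrap in a scheduled job for the tables that churn the most:

ALTER INDEX ALL ON dbo.MyBusyTable REBUILD;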

Annoyingly this should all be ready just before our much more efficient mass insertion feature passes testing.

If you have the resources, I'd try a solid state drive or SAN system, as you must remember your data will be lost as soon as the power goes off.

Don't forget to back it up once you've finished!

Robots.txt, don't forget to fill it in

The primary purpose of the robots.txt file on your website is to tell search engines and crawlers what NOT to index.

One common mistake is to not fill it in at all, which can cause Google to think the file is invalid.

This usually happens if you think 'I don't want to block anything on this site', so you don't put anything in it.

 

My advice when creating a robots file is to create a rule to block an imaginary folder or file, eg:

User-agent: *
Disallow: /mynonexistantdirectory/
Disallow: /myloginpage.ashx
Sitemap: http://mywebsiteURL/sitemap.xml

(replacing mywebsiteURL with whatever your site is)

If you have things you actually need to block, then use them instead; just don't leave it blank.

Virtual machine reverted to snapshot by itself

Ever had a virtual machine revert to an earlier snapshot by itself, for no apparent reason?

It happened to us today, and it was partly due to a poorly designed popup message.

In VMware there is an option to tell Workstation what to do when you shut down a virtual machine.

The options are:

  • Just power off
  • Revert to snapshot
  • Take a new snapshot
  • Ask me


If you have this set to 'Ask me', then you will get a popup message when you shut down the virtual machine.

The problem is that the options on this pop-up have a very wide area that's sensitive to mouse clicks. The clickable area extends right over to the OK button, which means that if you're not careful and miss the OK button, you select the 'Revert to snapshot' option.


To prevent this from happening (and it will, usually when things are slow and not working well), set the 'Just power off' option in the settings for each VM.

 

Hope that helps!

ISNULL and IS NULL in SQL Server

Some time ago, I attended a training course and got chatting to a fellow learner, who was talking about ISNULL.

For a while I thought he was saying IS NULL, which confused me, but he explained the difference and, if like me you never got round to reading that part of the manual, you'll find it quite useful.

 

ISNULL is effectively a nice little command that tells SQL Server: "If this value is null, then replace it with another value"

whereas...

IS NULL is a comparison for any values that are NULL.

 

The ISNULL statement is particularly useful when you've got data that hasn't been filled in or collected and yet is displayed in a report to an end user.
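As a quick illustration (the table and column names here are invented):

-- ISNULL: substitute a friendly default where a value is missing
SELECT CustomerName, ISNULL(Phone, 'Not supplied') AS Phone
FROM Customers

-- IS NULL: find the rows where the value really is missing
SELECT CustomerName
FROM Customers
WHERE Phone IS NULL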


Load testing ate my Data Storage

It's been a while since I created it all, but the IT guy has now sent an all-staff email requesting that we clear up excess data.

My biggest sin is the reams of load-testing data produced by a very enthusiastic bid to prove our system was scalable.

If you're going to have to generate huge amounts of traffic directed at a database-driven site, then be prepared to create lots and lots of data.

 

...especially if the client wants proof!

Items they'll probably want recorded and stored are:

  • Response times
  • Throughput
  • Requests per second
  • Web logs proving the activity
  • Performance monitor information
  • SQL analysis data
  • Machine /drone data
  • Number of errors that occurred
  • Performance versus user info / ramp-up info

A lot of this information will be duplicated and not entirely necessary, but if the customer wants it, it must be provided. It also adds credibility to your testing and backs up any in-house tools you are using to create the loads.

Web logs are especially storage hungry, as their main purpose is to be produced and stored as efficiently as possible. Zipping these up helps considerably, and it's worth having a good zipping tool at hand as your fallback plan.

If you have time, it's worth archiving the data in a more efficient storage system, for instance writing a parser to place IIS log entries in a database and to normalise the data. This will save lots of storage space and hours of zipping and unzipping. It will also leave your information retrievable and searchable when you've finished.


Creating your perfect Virtual machine

What I do when creating a virtual machine...

 

  • Carefully select drive size: if doing lots of rollbacks, do you need 20 GB free or just enough to test your apps?
  • Install OS/Windows etc...
  • Get it to the state you require.
  • Turn off auto-updates if you don't want it to change in the future.
  • Remove old update files / clear temp files and recycle bin etc...
  • Install tools you need for testing. Notepad++ is good.
  • Install required browsers etc...
  • Install VMware Tools.
  • Change the admin password to something memorable, but not insecure.
  • Set VMware Tools to sync the clock with the host.
  • Add commonly used shortcuts to your desktop.
  • If you'll edit lots of files, open them once in your favourite editor to ensure the file types are mapped to the tools you use.
  • Run your required apps at least once, so they get their initialisations out of the way.
  • Restrict resource-hungry apps, e.g. SQL Server (fix memory usage to prevent excessive hard drive growth).
  • Clear out event logs.
  • Set up Internet Explorer with the trusted sites you use most often.
  • Defrag the hard drive... reboot... defrag again.
  • Shrink the hard drive using VMware Tools.
  • Reboot.
  • Shutdown.
  • Sysprep the VM if you're going to make many copies.
  • Take a snapshot.

TV guide quirks

Does it annoy anyone else that the up button on a TV remote changes the channel up in normal view, but moves down when viewing the guide page?

Regards Retards!

The other day, I was thinking about the best way to sign off emails.

As I hate email signatures and stuff like that, I found a simple inbound email had the simplest and easiest sign-off...

 

Regards,

Joe Bloggs.

 

Perfect, until you try and use it yourself. I'm a great person for typos, and just as I was about to send a recent email (to lots of co-workers), I checked my sign-off.

It said...

 

Retards!

Joe Bloggs.

 

So as a little lesson, it may be worthwhile to automate your sign-off and use email signatures, rather than offend.

 

So long,

and kind Retards!

 

Must-have developer tools

  • Service Capture
  • VMware Workstation
  • Visual Studio
  • Notepad++
  • Firefox add-on - ColorZilla
  • Firefox add-on - CSS Viewer
  • Firefox add-on - MeasureIt
  • Firefox add-on - X-Pather
  • Firefox add-on - XPath Checker
  • Firefox add-on - FireFTP
  • Firefox add-on - Web Developer
  • Subversion
  • TortoiseSVN
  • AnkhSVN
  • OpenOffice
  • SQL Server
  • Visual Studio Essentials
    This provides some neat tools; one is the addition of lorem ipsum text, and you can also click on web page items in IE whilst debugging and be taken straight to the code in the VS editor.
  • Remote Desktop Connection Manager
  • Fences
  • FileZilla - for FTP
  • WinDiff
    For comparing changes across files
  • Paint.NET
  • IrfanView - really handy for changing image file sizes and batch processing en masse.

Let me know if you have any other cool add-ons or tools that you use regularly.

VMware viewer and Windows 2000 screen resolution

A while ago, I retired an old Windows 2000 PC.

Before I sent it to the big landfill in the sky, I used VMware Converter to snapshot it.

This nicely circumvented the hard drive problems I was having, which meant it no longer took about 30 minutes to boot.

Unfortunately, when using VMware viewer to run it, there were no VMware Tools installed and the screen resolution was locked.

The screen resolution was fixed at 640x480, as there were no proper display settings configured.

Last night I sat down and finally fixed it...

 

A solution to this is to obtain a copy of VMware Tools. Connect a CD drive or ISO image using the VMware Player controls and install the tools.

Reboot the virtual machine and then re-attempt the process of changing the display settings....

...voila!! They work now!

Job done! I can now see a decent-sized screen.

 

SQL management interface locked the database and remote logins

Today there was a problem.

The site stopped working, no one could log into the interface, and remote desktop onto the server failed...

The problem was someone doing a backup of the database server. They had remote desktopped into the system and started a backup; halfway through, an error message popped up and the system halted while SQL Server waited for the response.

Every now and then, something you do on MS SQL Server throws up an error message. I'd advise you to always act on it in a timely manner, as sometimes it keeps tables in the database locked whilst waiting for a reply.

Keep focused and don't do backups during critical times of the day.

 

If anyone knows why this happens, I'd love to know, so do leave a comment.

Cheers!

Reducing transaction log growth in Microsoft SQL Server 2005/2008

A situation that you may find yourself in when developing or managing a database with high throughput is that the transaction log tends to grow faster than you would like.

 

The easiest way out is to switch the database to the simple recovery model. But this might not be an option... for instance, you may need to ensure that you can roll back data and provide full recovery ability.
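For reference, that switch is a one-liner (the database name is invented):

ALTER DATABASE MyDb SET RECOVERY SIMPLE;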

The second option is to increase the frequency of your full backups and clear your transaction log out more often. This may keep you running for a little longer, but can be lots more work.

The third option is to look at your Stored Procedure code and reduce unwanted transactions.

Transactions - good or bad

There's really no argument to say that transactions are bad on the whole. Without them, databases would be as reliable as a bin-man on strike. But they can be too much of a good thing on busy, data-critical databases.

A developer will go round ensuring all essential logic is encased in a try-catch and is enclosed in begin tran / commit tran / rollback tran statement(s).
This can be bad if all the code is doing is a select statement.
Transactions are only needed when something gets UPDATED/DELETED/INSERTED or the database gets modified in any way.

Step 1: Remove explicitly declared transactions where they're not needed.

Transactions where you don't expect them

Next comes a more sneaky culprit: temporary tables. These are often added to help process some complex logic or extract data from some XML file.
The act of creating a temporary table and doing insert/update statements creates a transaction for each one.

Step 2: Look for stored procedures that should just be doing select statements and check for temporary tables. If they're being used, try to remove them completely or replace them with table variables, as shown in the sketch below. These won't modify the database structure and therefore reduce the transactions.
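A minimal before/after sketch (the object names are invented):

-- Before: a temp table, which modifies tempdb and generates transactions
CREATE TABLE #Results (Id int, Name varchar(50))
INSERT INTO #Results (Id, Name)
SELECT Id, Name FROM Customers WHERE Region = 'North'

-- After: a table variable doing the same job
DECLARE @Results TABLE (Id int, Name varchar(50))
INSERT INTO @Results (Id, Name)
SELECT Id, Name FROM Customers WHERE Region = 'North'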

More transactions means less!!!?

Table variables may not be possible for your solution, or you may not have time to re-factor your code.

An alternative method may be to re-instate or add explicit transactions to enclose the whole stored procedure and then perform a purposeful rollback to clear the unnecessary transactions from the log.
This essentially doesn't change your code much, but gives you control over the many unexpected additional transactions that are present in your code.

How to spot them

Use SQL Profiler and add "Transaction" events to those you are monitoring. You will find your stored procedures execute, but there are many additional entries without any SQL text displayed. These are the hidden transactions that are being created.

Also use this stored procedure to help find use of temporary tables. It searches the stored procedures for some text...

-------------------------------------------------------------------

CREATE PROCEDURE [dbo].[Find_Text_In_StoredProcedure]
@StringToSearch varchar(100)
--WITH_ENCRYPTION_REPLACE_ME_FOR_LOCAL--
AS
   -- Wrap the search term in wildcards for the LIKE comparison
   SET @StringToSearch = '%' + @StringToSearch + '%'

   -- Search the source text of all stored procedures (Type = 'P')
   SELECT DISTINCT SO.Name
   FROM sysobjects SO (NOLOCK)
   INNER JOIN syscomments SC (NOLOCK) ON SO.Id = SC.Id
   WHERE SO.Type = 'P'
   AND SC.Text LIKE @StringToSearch
   ORDER BY SO.Name

--------------------------------------------------------------------
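To hunt for temporary tables, you can simply search for the '#' prefix (this will also match any '#' used in comments, but it narrows things down quickly):

EXEC dbo.Find_Text_In_StoredProcedure '#'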

How to use SQL profiler (quick guide)

SQL profiler usage

 

SQL Profiler is a great tool for fixing problems with your database. It can highlight where errors are occurring, and also to some extent assist with improving performance.
I say to some extent because, once you get running really fast, you need to take into account that Profiler does slow down the responses coming from your database. My policy on this is to always test and improve on a slow running machine or virtual machine and then deploy the changes to your fast server. Simply throwing more CPU and memory at the problem requires you to test with larger volumes, and makes you work harder rather than your tools.

 

  • First you'll need SQL Management Studio, either the full version or a developer version. (SQL Express won't have this.)
  • Connect to the database using SQL Management Studio. (You could try remoting to the server and using the Management Studio that's installed there if it's not on your machine.)
  • SQL Profiler can be found in [Start]->[Programs]->[Microsoft SQL 2005]->[Performance Tools], or through the [Tools] menu option in SQL Management Studio itself.
  • On initial startup you are presented with a blank screen. You need to create a new trace (the process whereby data is collected).
    Click [File]->[New trace].
    Select your database server to connect to (usually the one you're working on). Note: you will probably need admin permission or the correct settings to do this.
  • Use the standard template if this is your first attempt, but click on the [Events Selection] tab to have a look at what's available.
    Here you can add events to be monitored by checking the check boxes. More options are available if you choose "Show all events" and/or "Show all columns".
    You can also add filters to narrow down what you're looking for by right-clicking over a column header and selecting "Edit column filter". I mainly use this to select long-running events, eg: over 60ms initially, to filter out negligible actions.
  • When you select OK, the trace starts and the events are displayed onscreen.
  • Pause/Stop/Continue buttons are available at the top to allow you to browse the captured data once it's come in.
  • You can use this to gather information on what actions are taking the longest to execute, or... you can save the trace for the Database Tuning Advisor to process and make suggestions on improving performance.
    To do this click [File]->[Save as]->[Trace file].
    Once saved, go to [Tools]->[Database Tuning Advisor] and use this to analyse your trace file.

 

IE hangs whilst downloading flash movie

A few weeks ago, I came across a glitch which may catch out those of you developers who use Firefox a lot for testing your code.

 

I recently added an ASP.NET control to display YouTube videos. The links came from a database and the customer hadn't got much content. As a result, there were several entries that didn't have links. This worked fine in Firefox, but Internet Explorer hung whilst trying to download the non-existent video file.

Remember, always test things in both browsers, and use default URLs when linking to video, or just display something else like a coming soon message.


SQL optimisation checklist

  • Place all your code in stored procedures where possible.
  • Move declarations to the start of stored procedures.
  • Are transactions required? If there are no updates or inserts, then you should remove any transactions and rollback commands. These will just fill your transaction log up.
  • Repeated getdate() calls: why not declare a variable at the start of the stored procedure and re-use it later on in the code (see the sketch after this list).
  • Count(*) and count(fieldname): if you're deciding whether there is data in a table or looping through a number of rows, use the EXISTS command instead (also sketched after this list). If a count is required, just use one field instead of *, as this reduces the number of reads.
  • Rollback transactions as soon as possible. Sometimes people build error messages before rolling back... these can be done outside the transaction.
  • XML use... try upgrading to the latest XML methods.
  • Prevent recompilation.
  • Avoid temporary table modifications.
  • Reduce multiple lookups into one.
  • Move select statements to the start of stored procedures where possible. This reduces record locking times.
  • Modify multiple update statements to use CASE.
  • Reduce UNION commands.
  • Get a tool to help you... MS Query Analyser is pretty valuable.
  • Avoid nested loops.
  • Avoid functions in the WHERE clause, as this affects performance.
  • Use performance monitors where available.
  • Use a fixed load rather than the maximum load when trying to improve performance.
  • Call stored procedures using their fully qualified name.
  • Split larger stored procedures into smaller ones and have a controlling procedure.
  • Do not use sp_ as a stored procedure prefix.
  • Consider returning an integer value as a RETURN statement instead of returning it as part of a recordset.
  • Avoid temporary tables.
  • SET NOCOUNT ON.
  • Use the sp_executesql stored procedure instead of the EXECUTE statement.
  • Check the order of fields in the WHERE clause.
  • Consider additional indexes on popular search fields.
  • Reduce the data returned (avoid select *).
  • Are ORDER BY and GROUP BY required? These are quite expensive operations.
  • Increase worker threads.
  • Increase CPUs / hard drive speed / memory.
  • Consider offloading work.
  • Use select statements with the TOP keyword or the SET ROWCOUNT statement if you need to return only the first n rows.
  • Clustered indexes are preferable to nonclustered if you need to select by a range of values or you need to sort the result set with GROUP BY or ORDER BY.
  • Try to avoid using the DISTINCT clause.
  • Use table variables instead of temporary tables.
  • Create indexes on XML columns.
  • De-normalization.
  • Avoid use of NULL.
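
As promised above, here are quick sketches of two of the checklist items (all object names are invented for the example):

-- Declare getdate() once and re-use it, rather than calling it repeatedly
DECLARE @Now datetime
SET @Now = getdate()
UPDATE Orders SET ProcessedDate = @Now WHERE ProcessedDate IS NULL

-- Use EXISTS instead of COUNT(*) when you only need a yes/no answer
IF EXISTS (SELECT 1 FROM Orders WHERE CustomerId = 42)
   PRINT 'Customer has orders'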