A collection of technical fixes and other random stuff



How to debug Seed data in an MVC Entity Framework project

As database migrations and seed data become more code-based and more complex, you may find you have problems debugging some of this code.
For instance, if you are using the Package Manager Console to run the update-database command, you may get errors such as the following:

c# - Validation failed for one or more entities while saving changes to SQL Server Database

Debugging Package Manager Console Update-Database Seed Method

A first chance exception of type 'System.Data.Entity.Validation.DbEntityValidationException' occurred in EntityFramework.dll

If this is the case, I found the following solution helpful...

  1. Open up another instance of Visual Studio.
  2. In the new instance of Visual Studio, open the file for editing and insert a breakpoint where you suspect the problem is occurring.
  3. Go to the Debug menu and use "Attach to Process" to attach to your existing copy of Visual Studio.
  4. In your original instance of Visual Studio, go to the Package Manager Console and run "update-database".
  5. When your seed data code runs, you will now hit the breakpoint.
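As an alternative to attaching a second copy of Visual Studio, you can force the debugger from inside the Seed method itself. This is a hedged sketch, not the steps above: the Configuration class and MyContext names are the usual EF migrations defaults, assumed here, and System.Diagnostics.Debugger.Launch() simply prompts you to attach a debugger when update-database reaches the Seed code.

```csharp
using System.Data.Entity.Migrations;
using System.Diagnostics;

internal sealed class Configuration : DbMigrationsConfiguration<MyContext>
{
    protected override void Seed(MyContext context)
    {
        // Prompts to attach a debugger when update-database runs this method.
        // Remove this line once you have finished debugging.
        if (!Debugger.IsAttached)
        {
            Debugger.Launch();
        }

        // ... your existing seed code ...
    }
}
```

This pops up the debugger prompt on every migration run until the line is removed, so don't leave it in committed code.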

How to fix error: "The parameters dictionary contains a null entry for parameter 'id' of non-nullable type 'System.Int32' for method 'System.Web.Mvc.ActionResult"


When navigating to an MVC web page/action/controller, or when you launch your project for debugging, you may see this error displayed in your browser:

The parameters dictionary contains a null entry for parameter 'id' of non-nullable type 'System.Int32' for method 'System.Web.Mvc.ActionResult in .........
An optional parameter must be a reference type, a nullable type, or be declared as an optional parameter.
Parameter name: parameters

This is because a method on your MVC controller is expecting a parameter which hasn't been supplied.
If this is happening in your running application, you have probably forgotten to pass the required parameter into the controller's method.

If it happens when launching from Visual Studio for debugging, it may just be that Visual Studio has chosen to launch the page you are currently working on without parameters. Navigate to your home page or a known good starting point to get around this.
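If the parameter genuinely can be absent, the error message's own advice also applies: make it nullable or optional. A minimal hedged sketch (the controller and action names are invented for illustration):

```csharp
using System.Web.Mvc;

public class ClientController : Controller
{
    // int? allows the route to supply no id without triggering the
    // "parameters dictionary contains a null entry" error.
    public ActionResult Details(int? id)
    {
        if (id == null)
        {
            return RedirectToAction("Index");
        }

        return View(id.Value);
    }
}
```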

Unable to get property 'call' of undefined or null reference

Whilst testing an existing page we had developed, I came across this error message.

Unable to get property 'call' of undefined or null reference

This occurred on one of our MVC pages that use the telerik grid controls for MVC.

I used the F12 developer tools (or Firebug) to check the network activity and the files downloaded.

I found jquery.validate.min.js was being attached twice.

One was a straightforward script reference that was being bundled with the site.
The other instance was added by the Telerik ScriptRegistrar.
To prevent this second copy being used, I changed the line of code for the script registrar as follows:
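The exact line from the original project isn't preserved here. As a hedged reconstruction of the idea: the Telerik Extensions for ASP.NET MVC ScriptRegistrar can be told not to register its own copies of scripts the site already bundles, assuming the jQuery()/jQueryValidation() toggles available in later builds of the extensions:

```csharp
@(Html.Telerik().ScriptRegistrar()
    // Stop the registrar emitting its own jQuery and jquery.validate,
    // since the site already bundles them (assumed option names).
    .jQuery(false)
    .jQueryValidation(false)
)
```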




C# How to use an Enum with the switch statement

When trying to use enums combined with a switch statement, you may get the following error:

"A constant value is expected"

To get around this, try parsing the value so it becomes strongly typed, using Enum.Parse...


Then use the switch with the new strongly typed variable.


var myStat = (RecordStatus)Enum.Parse(typeof(RecordStatus), thisPurchase.Status);
switch (myStat)
{
    case RecordStatus.Closed:
        revokeAllowed = false;
        revokeDenialReason = "blah!";
        break;
    case RecordStatus.ProcessedAwaitingIssueOrNotice:
        //Should be okay
        break;
    case RecordStatus.Issued:
        revokeAllowed = false;
        revokeDenialReason = "blah";
        break;
    case RecordStatus.ProcessedNoFurtherAction:
        //Should be okay
        break;
}

Example of a JSON request to an MVC controller using jQuery

When trying to make an AJAX request to an MVC controller for some JSON data, I found there are a lot of incomplete examples and different ways of doing things, so here is my take on it.

Firstly, you need to create an ActionResult (or JsonResult) in your controller like this:

public ActionResult DoesClientHaveAccount(int clientId)

or

public JsonResult DoesClientHaveAccount(int clientId)

Returning the data requires you to convert it to a JSON object. A simple way to do this is with the Json helper method, as follows:

public JsonResult DoesClientHaveAccount(int clientId)
{
    return Json(new { success = false, Message = "Account found with no address!", noOfAccounts = 1 }, JsonRequestBehavior.AllowGet);
}

I've deliberately made this a little more complicated to show how we pass multiple values back to the calling script.

Note the JsonRequestBehavior.AllowGet argument in the call above.

Without it, you may find calls to your controller return an internal server error message.
This is because, without it, GET requests are not permitted to call the action.

Another important parameter is the content type. Without it, you may sometimes find the response at the client side appears as
[object Object]

This sometimes occurs on IIS if you don't have the MIME types set for JSON/AJAX etc. It can also occur when developing with Visual Studio's built-in web server.

To fix this, add the content type as follows:
 return Json(new { success = false, Message = "Account found with no address!", noOfAccounts = 1 }, "text/text", JsonRequestBehavior.AllowGet);
When calling this from the JavaScript code in the client browser, an example is as follows:

        $.ajax({
            type: "POST",
            url: "/Account/DoesClientHaveAccount/",
            data: { "clientID": clientId },
            traditional: true,
            async: false,
            success: function (returndata) {
                success = returndata.success;
                if (true == success) {
                    hasAccount = true;
                    var id_addCombo = arr[0].Value;  // arr: the array of matches returned by the server
                    var myName = arr[0].Text;
                    var myAddress = id_addCombo.split("$$")[1];
                    var myID = id_addCombo.split("$$")[0];
                } else {
                    hasAccount = false;
                }
            },
            error: function (xhr, textStatus, error) {
                alert("There was an error while checking for a matching account: " + error);
            }
        });

It's always good to have two levels of error checking.
The success: and error: callbacks are for triggering actions when all is well, and for handling errors when there is a serious comms error.

In this example, I pass back a success flag to indicate whether any of our server-side logic detected an error.
This example is based on the jQuery .ajax call. To use this, you will need to include the jQuery script files in your application.
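As an aside, the id_addCombo value in the script above packs two fields into one string separated by "$$". A small standalone sketch of that unpacking (the helper name and sample value are made up for illustration):

```javascript
// Unpack a "$$"-delimited value like the id_addCombo field above.
// The server is assumed to send e.g. "123$$10 High Street".
function parseIdAddress(combo) {
    var parts = combo.split("$$");
    return { id: parts[0], address: parts[1] };
}

var parsed = parseIdAddress("123$$10 High Street");
// parsed.id is "123" and parsed.address is "10 High Street"
```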

What the f**k is a 'Delegate' in programming?

I have rarely come across a good article on delegates, so after 20+ years in programming I'll have a go at explaining it myself.

I've often struggled to understand the concept, its use and its benefits, whilst actually having implemented them quite well.

Q: "...so what the fuck are delegates?"

"A delegate is a type that represents references to methods with a particular parameter list and return type. When you instantiate a delegate, you can associate its instance with any method with a compatible signature and return type. You can invoke (or call) the method through the delegate instance"

Q: "...so what the fuck are delegates?"


...Okay... you get my point!!!??? When you explain it in programming language, it kind of loses its practical meaning (unless you eat, sleep and breathe coding).

Q: "...so what the fuck are delegates?"


Well firstly, it helps if you are familiar with the concepts behind methods, events, reflection and multi-threading.
You only need a little knowledge from each area to follow this article further, namely:

  • You should know how to write a method or function. If not, go away now; why the hell are you reading this article?
  • You should know that, with little code, some pretty clever things can be done using reflection, and you can manipulate objects based on what information your code knows about their type and properties.
  • You should know multithreading is a way of firing off lots of processes at once to hopefully speed things up.


Let's go back to the answer I copied from Wikipedia earlier and give an example.

Say we have two functions:

string HowExplosive(string animalName, string vegetableName)


string HowTasty(string animalName, string vegetableName)

Both these functions take 2 strings as parameters and return a string as their output value.
They have the same characteristics, so we could describe all methods that have these characteristics by giving them a type.

Let's give them a type of DumbAssSillyFunctions. Let's code it!...

public delegate string DumbAssSillyFunctions(string animalName, string vegetableName);

and there we have it... a delegate... it tells us all DumbAssSillyFunctions
take 2 strings, one for an animal name and one for a vegetable name. And they also return a string.

Having established a way of classifying methods with this delegate/type, we've got ourselves some information about our code. Information about code is how reflection works, so you should be thinking that we have an opportunity here to reduce code and handle multiple situations.

You should also know that events are very similar to functions and so could also be described using a delegate statement, which often happens.

Are you understanding yet? Now, why did I mention reflection earlier?

Consider this... your program is receiving inputs of different combinations of animals and vegetables:

Cow, Carrot
Sheep, Turnip
Lamb, mint



Let's take this to the extreme... suppose we wanted to analyse every vitamin and mineral content for each combination. We could hard code all the method calls... or... we could use reflection to loop through all methods available and find the methods that are of type DumbAssSillyFunctions.
When we find a function that matches, we know to pass the animal in first and the vegetable in second, and we will get a result.
This lets our loop execute without knowing anything about the function names or how many there are.
....so we can add more features without breaking or having to modify the main loop.
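As a hedged sketch of that loop (the class, method names and sample strings are invented for illustration), reflection can find every method matching the delegate's signature and invoke each one through a delegate instance:

```csharp
using System;
using System.Reflection;

public delegate string DumbAssSillyFunctions(string animalName, string vegetableName);

public static class FoodCombos
{
    public static string HowExplosive(string animalName, string vegetableName)
    {
        return animalName + " + " + vegetableName + " = mildly explosive";
    }

    public static string HowTasty(string animalName, string vegetableName)
    {
        return animalName + " + " + vegetableName + " = delicious";
    }
}

public static class Program
{
    public static void Main()
    {
        // Loop over all public static methods and keep those whose
        // signature matches the delegate type.
        foreach (MethodInfo method in typeof(FoodCombos).GetMethods(BindingFlags.Public | BindingFlags.Static))
        {
            // Returns null rather than throwing when the signature doesn't match.
            var asDelegate = (DumbAssSillyFunctions)Delegate.CreateDelegate(
                typeof(DumbAssSillyFunctions), method, false);

            if (asDelegate != null)
            {
                // Animal first, vegetable second, string back - guaranteed by the delegate.
                Console.WriteLine(asDelegate("Cow", "Carrot"));
            }
        }
    }
}
```

Adding another matching method to FoodCombos makes it get picked up by the loop with no change to Main.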

This is an example of one routine calling many, and we can see how this can be used to launch multiple threads in a multi-threaded application.

Now let's reverse it and see how we can have multiple things calling one method to implement code re-use. Suppose we have some "FoodComboAnalyser" objects.
If we gave all of these events of a delegate type "CrazyAlert", one of them may fire if the animal and vegetable are too explosive. Another may fire if the combination is particularly tasty.
We may want to log these down or show a message when they occur.

We can do this by assigning a handler to the events. The handler will also have a matching delegate type. Once linked, the event handler catches the events and carries out the actions on the information passed to it.
This is how a lot of visual programming languages work, by handling onclick, onfocus, onkeypress etc...
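A hedged sketch of that wiring (the CrazyAlert delegate and FoodComboAnalyser class are invented names carried over from the text above):

```csharp
using System;

public delegate void CrazyAlert(string message);

public class FoodComboAnalyser
{
    // Two events declared with the same delegate type.
    public event CrazyAlert TooExplosive;
    public event CrazyAlert ParticularlyTasty;

    public void Analyse(string animal, string vegetable)
    {
        if (animal == "Cow" && vegetable == "Carrot")
        {
            if (TooExplosive != null) TooExplosive(animal + " + " + vegetable + " went bang!");
        }
        else
        {
            if (ParticularlyTasty != null) ParticularlyTasty(animal + " + " + vegetable + " - yum.");
        }
    }
}

public static class Program
{
    public static void Main()
    {
        var analyser = new FoodComboAnalyser();

        // One handler, matching the delegate type, catches both events.
        CrazyAlert logIt = delegate(string message) { Console.WriteLine("ALERT: " + message); };
        analyser.TooExplosive += logIt;
        analyser.ParticularlyTasty += logIt;

        analyser.Analyse("Cow", "Carrot");
    }
}
```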

Continuing the handling of events... an extreme example of event management is the Node.js framework, and I recommend the codeschool.com tutorial on it.
Here they explain the event loop, which constantly listens for events and handles them by passing the data on to processing functions.

Watch it and you may note that it is a pretty impressive way of handing out work. I likened it to a super-duper project manager handing out lots of work to other people...

or delegating ...to delegates...

Has it clicked yet? :)




Solving error: "Unable to cast object of type '<>f__AnonymousType1`1[myType]' to type 'myType'"


When coding in MVC, I came across this (or a similar) error message:

"Unable to cast object of type '<>f__AnonymousType1`1[myType]' to type myType"


I had been getting used to wrapping parameters in new{} when passing data from the controller to the view.
This error message was saying: I can't convert your class to your class. The new{} statement wrapping my object was changing it to an 'anonymous type'. So how did I fix it?




I found that removing the new{} section of the code solved this for me.
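A hedged illustration of the difference (the controller, action and model names are invented):

```csharp
using System.Web.Mvc;

public class MyModel { }

public class ClientController : Controller
{
    public ActionResult Details()
    {
        var myModel = new MyModel();

        // return View(new { model = myModel });
        // ...causes the cast error: the view's cast to MyModel fails
        // because the model passed is now an anonymous type.

        return View(myModel);   // works: the view receives MyModel directly
    }
}
```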

Schema Compare for Oracle error: "The given key was not present in the dictionary"

When trying to use Schema Compare for Oracle, the following message appeared and the comparison failed with no detailed information...

"The given key was not present in the dictionary"




There is not much detail on the issue, but to work around this, carefully watch the progress display as the comparison is running.

Then look at the options for your project; there may be a switch to ignore the item it was processing just before it fell over.
Try this or a similar/related option, or just try a few to see if it works and narrow it down.

I found on this occasion the 'Ignore permission' option allowed me to continue, but I would love to know which object was causing this error.

You may have similar issues on other comparison groups. Hopefully this technique will help provide a workaround if a cure is not possible.


For further information, you may want to try enabling verbose logging; see this link for instructions...



How to detect and stop long running processes in Oracle






To get a list of long running processes/sessions:




SELECT sid, opname, sofar, totalwork, units, elapsed_seconds, time_remaining
FROM v$session_longops
--WHERE sofar != totalwork;



To view all sessions:




SELECT *
FROM v$session s;


To kill a session:

ALTER SYSTEM KILL SESSION 'sid,serial#';

(where sid and serial# are taken from the v$session data shown in the script above)




Running a published command line executable as a Scheduled task in Windows 7

If you experience any of the following errors while trying to run a scheduled task:

  • Task Scheduler failed to start instance "{xxxxxxxx-8xxxa43-xxx-xxxx-xxxxxxxxxx}" of "myprogram"  task for user "mydomain\myuser" . Additional Data: Error Value: 2147942593.

It may be that you are using Visual Studio's publishing (ClickOnce deployment) to deploy this.

How to fix it:

First of all, once published, you need to run the myapp.application file for the program to be installed.

Then you need to consider that this installation puts the application elsewhere. If you don't need auto-updating, then the best bet is to point your scheduled task to the installed location.

Typically this would be in...

C:\Users\myuser\AppData\Local\Apps\2.0\xxxxxxxx\...

Where the user is the same one you installed it as/will be running it as, and the xxxxxxxx is some random crap.

The best way to find this is to search for a file in your app under C:\Users\myuser\AppData\Local\Apps\2.0, then right click on the file and select 'Open file location'.

This solution worked for me; however, our solution did not require auto-updating, which some of you might need.

Any advice on getting the main .application file to run as a scheduled task would be great. I suspect it will be permission-based problems.


VMRun reports could not connect to the virtual machine or permission denied


Recently I was creating a C# Windows service which interacted with VMware machines.

When trying to start up a VM by calling vmrun.exe, I got the error message

"Unable to connect to host"

I found this by piping the response to a text file.

This is how it was solved:

To get around this, I changed the command I was using to not use the GUI, via the nogui option.

The only thing you need to consider is that running VMs will not be controllable from your manual login/UI, as they are running via a different process.
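As a hedged sketch of the command (the .vmx and output paths are invented), starting a Workstation VM with nogui, with the response piped to a text file as mentioned above:

```shell
vmrun -T ws start "C:\VMs\MyMachine\MyMachine.vmx" nogui > C:\Temp\vmrun-output.txt
```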


Full command line details are available here:


SQL select does a high number of reads for just one record

If you find a select statement is doing a lot of reads when you only have a few records (or one record) in the table, this may be down to image data types.

I found in SQL Server I was getting 4000+ reads on a select statement to retrieve 1 or 2 records.

This is probably down to the image datatype, which needs several reads across pages to retrieve it.

I sped this up by changing one of the selects to not return the image column, as it was not needed.
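A hedged sketch of the change (the table and column names are invented):

```sql
-- Before: pulls the image column, and its extra page reads, every time
SELECT * FROM dbo.Documents WHERE DocumentId = 42;

-- After: list only the columns needed, leaving the image column out
SELECT DocumentId, Title, CreatedDate
FROM dbo.Documents
WHERE DocumentId = 42;
```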



Speeding up SQL server for millions of records

Recently I've been required to insert 6 million database records into the latest version of one of our products.

Actually, it's a lot more than this, because it is 6 million business objects, which of course tend to have their related lookup tables, so the figure is probably much higher.

The problem I have is that, for reliable data, the information needs to be inserted through the interface to guarantee it is like a production environment, and this is slow... especially when you get over 1 million database records in your table.


As an experienced database developer, my first thoughts were to look at Profiler and get some information on improving the performance with code tweaks or better indexing. Being an almost mature product, there was little to do here, but a few changes to the indexes and stored procedures helped a bit. This gave me about a 50% speed improvement, but it didn't last for long.

Soon the process of inserting records, and the background logic doing lookups to check for duplicates etc., was slowing down to three or four records per second, which was going to take forever. At 2 million records, it was clear that this needed a different approach.


To rub salt in the wound, there were other factors stopping me doing my task.

  • I had a brilliant load testing app I had written that could reduce call time/latency to the website by doing all this multithreaded. Unfortunately the code wasn't very threadsafe, and performance got worse with multiple threads due to all the locking.
  • My super fast servers from last year had been hijacked by IT to run SharePoint.
  • I could just run several databases at low volumes and piece them all together. This involved a big manual manipulation of the data (a risk) and also... I didn't have time. At tables of this size, even SQL Server's interface was struggling.


The solution was based around previous tests where we found a super fast SAN storage system sped up performance significantly. However, I didn't have access to one at such short notice.

I decided to improvise and create a RAM disk to store the SQL database on and hopefully speed things up. This in theory should be much faster at seeking, reading and writing than a standard mechanical hard drive.

I used a PC with 12GB RAM, and created a RAM drive.

I downloaded a trial version of this software: http://www.superspeed.com/servers/ramdisk.php and created an 8GB drive within 5 mins.

I then 'detached' my database using the SQL Server interface, and moved the .mdf and .ldf files to the RAM drive.

I then re-attached the database, making sure SQL Server had permission to the new files and their storage location.
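The detach/move/re-attach steps can also be scripted. A hedged sketch, assuming a database called MyDb and a RAM drive mounted as R: (the names and paths are invented):

```sql
-- Detach the database so its files can be moved
USE master;
EXEC sp_detach_db @dbname = 'MyDb';

-- (Copy MyDb.mdf and MyDb_log.ldf to R:\ outside SQL Server)

-- Re-attach from the RAM drive; the SQL Server service account
-- needs read/write permission on the new location
CREATE DATABASE MyDb
ON (FILENAME = 'R:\MyDb.mdf'),
   (FILENAME = 'R:\MyDb_log.ldf')
FOR ATTACH;
```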

After a database shrink and index rebuild on all tables, I re-started my task.

Insertion continued at 50-100 inserts per second, which was much faster than before. And the slowdown has stopped, or at least is progressing at a slower rate than before.


You can also move the system tempdb file to the RAM disk, which can speed things up further. However, bear in mind that if you need to do any large data deletion or modification on large numbers of rows, you may find tempdb grows and fills your RAM disk whilst trying to do this. For this reason, I left this out on my final run.


Don't forget a large database will probably get several maintenance tasks carried out on it over a growth like this, so it may be handy to set an hourly index rebuild task whilst your insertion run is executing.

Annoyingly this should all be ready just before our much more efficient mass insertion feature passes testing.

If you have the resources, I'd try a solid state drive or SAN system, as you must remember your data will be lost as soon as the power goes off.

Don't forget to back it up once you've finished!

ISNULL and IS NULL in SQL Server

Some time ago, I attended a training course and got chatting to a fellow learner, who was talking about ISNULL.

For a while I thought he was saying IS NULL, which confused me, but he explained the difference, and if, like me, you never got round to reading that part of the manual, you'll find it quite useful.

ISNULL is effectively a nice little command that tells SQL Server: "If this value is null, then replace it with another value."

IS NULL is a comparison for any values that are NULL.

The ISNULL function is particularly useful when you've got data that hasn't been filled in or collected and yet is displayed in a report to an end user.
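A hedged sketch of both, against an invented Customers table:

```sql
-- ISNULL: substitute a default when the stored value is NULL,
-- e.g. for a report shown to end users
SELECT Name, ISNULL(Phone, 'Not supplied') AS Phone
FROM dbo.Customers;

-- IS NULL: a comparison, used to find the rows where the value is missing
SELECT Name
FROM dbo.Customers
WHERE Phone IS NULL;
```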





Load testing ate my Data Storage

It's been a while since I created it all, but the IT guy has now sent an all-staff email to request clearing up excess data.

My biggest sin is the reams of Load testing data produced by a very enthusiastic bid to prove our system was scalable.

If you're going to have to generate huge amounts of traffic directed at a database driven site, then be prepared to create lots and lots of data.


...especially if the client wants proof!

Items they'll probably want recorded and stored are:

  • Response times
  • Throughput
  • Requests per second
  • Web logs proving the activity
  • Performance monitor information
  • SQL analysis data
  • Machine /drone data
  • Number of errors occurred
  • Performance versus user info / ramp-up info

A lot of this information will be duplicated and not entirely necessary, but if the customer wants it, it must be provided. It also adds credibility to your testing and backs up any in-house tools you are using to create the loads.

Web logs are especially storage hungry, as their main purpose is to be produced and stored as efficiently as possible. Zipping these up helps considerably, and it's worth having a good zipping tool at hand as your fall-back plan.

If you have time, it's worth archiving the data in a more efficient storage system, for instance writing a parser to place IIS log entries in a database and to normalise the data. This will save lots of storage space and hours of zipping and unzipping. It will also leave your information retrievable and searchable when you've finished.




Must have developer tools

  • Service capture
  • VM Ware workstation
  • Visual studio
  • Notepad ++
  • Firefox add on - ColorZilla
  • Firefox add on -CSS viewer
  • Firefox add on -Measure it
  • Firefox add on - X-Pather
  • Firefox add on -C-PAth checker
  • Firefox add on -FireFTP
  • Firefox add on - Web Developer
  • Subversion
  • TortoiseSVN
  • AnkhSVN
  • Open Office
  • SQL server
  • Visual Studio Essentials
    This provides some neat tools; one is the addition of lorem ipsum text, and you can also click on web page items in IE whilst debugging and be taken straight to the code in the VS editor.
  • Remote Desktop Connection Manager
  • Fences
  • FileZilla -for FTP
  • WinDiff
    For comparing changes across files
  • Paint.NET
  • IrfanView - really handy for changing image file sizes and batch processing en masse.

Let me know if you have any other cool add ons or tools that you use regularly.

Reducing transaction log growth in Microsoft SQL Server 2005/2008

A situation that you may find yourself in when developing or managing a database with high throughput is that the transaction log tends to grow faster than you would like.


The easiest way out is to switch the database to use simple logging. But this might not be an option ....for instance, you may need to ensure that you can roll back data and provide full recovery ability.

The second option is to increase the frequency of your full backups and clear your transaction log out more often. This may keep you running for a little longer, but can be a lot more work.

The third option is to look at your Stored Procedure code and reduce unwanted transactions.

Transactions - good or bad

There's really no argument to say that transactions are bad on the whole. Without them, databases would be as reliable as a bin-man on strike. But they can be too much of a good thing on busy, data-critical databases.

A developer will go round ensuring all essential logic is encased in a try-catch and is enclosed in begin tran / commit tran / rollback transaction statement(s).
This can be bad if all the code is doing is a select statement.
Transactions are only needed when something gets UPDATED/DELETED/INSERTED or the database gets modified in any way.

Step 1: Remove explicitly declared transactions where they're not needed.

Transactions where you don't expect them

Next comes a more sneaky culprit: temporary tables. These are often added to help process some complex logic or extract data from some XML file.
The act of creating a temporary table and doing insert/update statements creates a transaction for each one.

Step 2: Look for stored procedures which should just be doing select statements and check for temporary tables. If they're being used, try to remove them completely or replace them with table variables. These won't modify the database structure and therefore reduce the transactions.
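A hedged sketch of the swap (the table shape and names are invented):

```sql
-- Before: a temporary table - its creation and population are logged
CREATE TABLE #Totals (ClientId int, Total money);
INSERT INTO #Totals (ClientId, Total)
SELECT ClientId, SUM(Amount) FROM dbo.Orders GROUP BY ClientId;

-- After: a table variable - no change to the database structure
DECLARE @Totals TABLE (ClientId int, Total money);
INSERT INTO @Totals (ClientId, Total)
SELECT ClientId, SUM(Amount) FROM dbo.Orders GROUP BY ClientId;
```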

More transactions means less!!!?

Table variables may not be possible for your solution, or you may not have time to re-factor your code.

An alternative method may be to re-instate or add explicit transactions to enclose the whole stored procedure and then perform a purposeful rollback to clear the unnecessary transactions from the log.
This essentially doesn't change your code much, but gives you control over the many unexpected additional transactions that are present in your code.

How to spot them

Use SQL Profiler and add "Transaction" events to those you are monitoring. You will find your stored procedures execute, but there are many additional entries without any SQL text displayed. These are the hidden transactions that are being created.

You can also use this stored procedure to help find use of temporary tables. It searches the stored procedures for some text:


CREATE PROCEDURE [dbo].[Find_Text_In_StoredProcedure]
   @StringToSearch varchar(100)
AS
BEGIN
   SET @StringToSearch = '%' + @StringToSearch + '%'
   SELECT DISTINCT SO.Name
   FROM sysobjects SO (NOLOCK)
   INNER JOIN syscomments SC (NOLOCK) ON SO.Id = SC.Id
   WHERE SO.Type = 'P'
   AND SC.Text LIKE @StringToSearch
END


How to use SQL profiler (quick guide)

SQL profiler usage


SQL Profiler is a great tool for finding problems with your database. It can highlight where errors are occurring, and also to some extent assist with improving performance.
I say to some extent because, once you get running really fast, you need to take into account that Profiler does slow down the responses coming from your database. My policy on this is to always test and improve on a slow running machine or virtual machine and then deploy the changes to your fast server. Simply throwing more CPU and memory at the problem requires you to test with larger volumes and makes you work harder rather than your tools.


  • First you'll need SQL Management Studio, either the full version or a developer version. (SQL Express won't have this.)

  • Connect to the database using SQL Management Studio. (You could try remoting to the server and using the Management Studio that's installed there if it's not on your machine.)

  • SQL Profiler can be found in [Start]->[Programs]->[Microsoft SQL 2005]->[Performance Tools], or through the [Tools] menu option in SQL Management Studio itself.

  • On initial startup you are presented with a blank screen. You need to create a new trace (the process whereby data is collected).
Click [File]->[New trace]
Select your database server to connect to (usually the one you're working on). Note: you will probably need admin permission or the correct settings to do this.

  • Use the standard template if this is your first attempt, but click on the [Events Selection] tab to have a look at what's available.
Here you can add events to be monitored by checking the check boxes. More options are available if you choose "Show all events" and/or "Show all columns".
You can also add filters to narrow down what you're looking for by right clicking over a column header and selecting "Edit column filter". I mainly use this to select long running events, e.g. over 60ms initially, to filter out negligible actions.

  • When you select okay, the trace starts and the events are displayed onscreen.

  • Pause/Stop/Continue buttons are available at the top to allow you to browse the captured data once it's come in.

  • You can use this to gather information on what actions are taking the longest to execute, or you can save the trace for the Database Tuning Advisor to process and make suggestions on improving performance.
To do this, click [File]->[Save as]->[Trace file].
Once saved, go to [Tools]->[Database Tuning Advisor] and use this to analyse your trace file.


SQL optimisation checklist

  • Place all your code in stored procedures where possible
  • Move declarations  to the start of stored procedures.
  • Are transactions required? If there are no updates or inserts, then you should remove any transactions and rollback commands. These will just fill your transaction log up.
  • Repeated getdate() commands: why not declare a variable at the start of the stored procedure and re-use it later on in the code?
  • Count(*) and count(fieldname): if you're deciding whether there is data in a table or looping through a number of rows, use the EXISTS command instead. If a count is required, just use one field instead of *, as this reduces the number of reads.
  • Rollback transactions as soon as possible. Sometimes people build error messages before rolling back... these can be done outside the transaction.
  • XML use... try upgrading to the latest XML methods.
  • Prevent recompilation
  • Avoid temporary table modifications
  • Reduce multiple lookups into one.
  • Move select statements to the start of stored procedures where possible. This reduces record locking times.
  • Modify multiple update statements to use CASE.
  • Reduce UNION commands
  • Get a tool to help you....MS query analyser is pretty valuable.
  • Avoid nested loops
  • Avoid functions in the WHERE clause as this affects performance.
  • Use performance monitors where available.
  • Use a fixed load rather than the maximum load when trying to improve performance
  • Call stored procedures using their fully qualified name
  • Split larger stored procedures into smaller ones and have a controlling procedure.
  • Do not use sp_ as a stored procedure prefix
  • Consider returning the integer value as a RETURN statement instead of returning an integer value as part of a recordset.
  • Avoid temporary tables
  • Use the sp_executesql stored procedure instead of the EXECUTE statement.
  • Check the order of fields in the WHERE clause
  • Consider additional indexes on popular search fields
  • Reduce the data returned (avoid select *)
  • Are ORDER BY and GROUP BY required? These are quite expensive operations
  • Increase worker threads
  • Increase CPUs / hard drive speed / Memory
  • Consider offloading work.
  • Use select statements with the TOP keyword or the SET ROWCOUNT statement if you need to return only the first n rows.
  • Clustered indexes are preferable to nonclustered if you need to select by a range of values or you need to sort the result set.
  • Try to avoid using the DISTINCT clause
  • Use table variables instead of temporary tables
  • Create indexes on XML columns
  • De-normalization
  • Avoid use of NULL
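As one worked example from the list above, here is a hedged sketch of the EXISTS swap (the table and column names are invented):

```sql
-- Instead of counting every row just to test for existence...
IF (SELECT COUNT(*) FROM dbo.Orders WHERE ClientId = 42) > 0
    PRINT 'has orders';

-- ...EXISTS can stop at the first matching row
IF EXISTS (SELECT 1 FROM dbo.Orders WHERE ClientId = 42)
    PRINT 'has orders';
```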