Many of us install server-side (ASP, CGI, or PHP) scripts on our web sites, and many of these scripts store data on the server. However, poorly designed scripts can suffer performance problems, and sometimes even data corruption, on busy (and not-so-busy) web sites.
If you're not a programmer, why should this matter to you?
Answer: Even if you're just installing and using server-side scripts, you'll want to make sure that the scripts you choose don't randomly break or corrupt your data.
First, some examples of the types of scripts that store data on web servers:
(Of course, many scripts in each of these categories, and others, are well designed and run perfectly well even on very busy web sites.)
1. Follow-up autoresponders typically store a list of subscribers to the autoresponder, as well as where in the sequence of messages each subscriber is. Examples of autoresponder scripts: http://www.scriptcavern.com/scr_email_auto.php
2. Classified ad scripts store (at least) a list of all classified ads placed by visitors. Examples of this type of script: http://www.scriptcavern.com/scr_classified.php
3. Free for all links scripts store a list of all links posted by visitors. See some example scripts listed at: http://www.scriptcavern.com/scr_ffa.php
4. Top site scripts usually store a list of the top site's members, as well as the number of "votes" each member has received. For examples of this type of script, see http://www.scriptcavern.com/scr_topsite.php
So what kind of scripts have problems? And what sort of problems am I talking about?
Well, the principal problems all relate to what happens when bits of data from multiple users need to be stored or updated at the same time. Some scripts handle these situations well, but others don't...
Here's a common data corruption problem that can occur with many scripts:
1. When some bit of data needs to be updated, a copy of the server-side script starts running and begins updating it.
2. If another user comes along and does an update before the first copy of the script has finished, a second copy of the script starts running at the same time.
3. There are a number of ways things can now go wrong, for example:
(a) What if the first copy of the script reads in the data, then the second copy reads the same data, then the first copy updates the data, then the second copy updates the data? Answer: any changes made by the first copy of the script get lost.
(b) What if the first and second copies of the script are both adding multiple bits of new data to the store at the same time? For example, imagine each needs to store the headline, description, and name of the person posting a classified ad. With some scripts, the two classified ads can get intermingled, so you might end up with (for example) HEADLINE-1, DESCRIPTION-1, HEADLINE-2, PERSON-1, DESCRIPTION-2, PERSON-2. Or worse yet, you might get bits of each part of each ad mixed with bits of the other. This is usually really bad news, as your data may become unusable from that point on.
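The "lost update" scenario in (a) can be sketched as a short, deterministic simulation. Python is used here purely for illustration (the same logic applies to any server-side language), and the counter is a made-up stand-in for whatever data your script stores:

```python
# Deterministic simulation of the "lost update" race in step (a).
# Two copies of a script each try to add 1 to a shared counter, but
# both read the old value before either writes, so one update is lost.

shared_counter = 10  # the value stored on the server before either copy runs

# Copy 1 reads the data...
copy1_value = shared_counter
# ...then copy 2 reads the SAME (still unmodified) data.
copy2_value = shared_counter

# Copy 1 writes its update back.
shared_counter = copy1_value + 1   # counter is now 11
# Copy 2 writes its update back, overwriting copy 1's work.
shared_counter = copy2_value + 1   # counter is 11 again, not 12!

print(shared_counter)  # 11 -- copy 1's change has been silently lost
```

On a real server the two "copies" are separate processes and the timing is down to luck, which is exactly why the bug can lurk unnoticed for thousands of requests.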
Does this sound too unlikely a problem to worry about? Don't bank on it... even if it happens only one time in 1,000, or one in 10,000, eventually it will happen: you need a solution.
So the real question is: can programmers create scripts without these kinds of problems? Fortunately the answer is yes, and there are a number of ways they can address it:
1. They can store each bit of data in a separate file. This isn't necessarily a complete solution by itself (in particular, a script that does only this can still have problems if multiple copies update the same file at the same time), but it does make data corruption less likely, and if corruption does occur, at least it won't corrupt the entire data store in one go.
2. They can use file-locking. This means that if one copy of a script is working with a file, other copies are prevented from working on that file until the first copy has finished. File-locking works if done correctly, but it must be programmed into a script very carefully and precisely, for every single possible case... even a tiny bug or omission can let data corruption in through the back door!
3. They can use a database (such as MySQL) to store the data. Provided the data is properly structured in the database, the database handles locking automatically. And since the programmer doesn't have to write their own locking routines, the possibility of bugs and omissions is much reduced.
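As a rough illustration of approach 2, here is a minimal Python sketch using the Unix `fcntl.flock` call (other languages and platforms have equivalent locking calls). The file name `ads.txt` and the `append_ad` function are made up for this example:

```python
import fcntl  # Unix-only file locking; Windows scripts need a different mechanism

DATA_FILE = "ads.txt"  # hypothetical data file used by a classified ad script

def append_ad(headline, description, person):
    # Open for append and take an exclusive lock. A second copy of the
    # script blocks at flock() until the first copy releases the lock,
    # so the three lines of one ad can never be interleaved with another's.
    with open(DATA_FILE, "a") as f:
        fcntl.flock(f, fcntl.LOCK_EX)
        try:
            f.write(headline + "\n")
            f.write(description + "\n")
            f.write(person + "\n")
            f.flush()
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)
```

Note how easy it would be to get this subtly wrong: forget the lock on just one of the code paths that writes the file, and the corruption scenario above is back.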
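And as a sketch of approach 3, the following Python example uses SQLite as a convenient stand-in for a server database like MySQL (the `ads` table and its columns are invented for illustration). A single INSERT is atomic, so the database's own locking keeps concurrent posts from intermingling:

```python
import sqlite3

# SQLite stands in here for MySQL; both serialize conflicting writes
# for you, so the script needs no hand-written locking code.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ads (headline TEXT, description TEXT, person TEXT)")

def post_ad(headline, description, person):
    # One atomic INSERT stores all three fields of the ad together;
    # "with conn" commits on success and rolls back on error.
    with conn:
        conn.execute("INSERT INTO ads VALUES (?, ?, ?)",
                     (headline, description, person))

post_ad("Bike for sale", "Hardly used", "Alice")
post_ad("Free kittens", "Two left", "Bob")
rows = conn.execute("SELECT headline FROM ads").fetchall()
print(len(rows))  # 2 -- each ad stored whole, never intermingled
```

This is why, all else being equal, a script that stores its data in a real database is usually the safest choice for a busy site.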