Monthly Archives: August 2010

Removing everything other than .svn

After updating some code, I was downloading it into my local svn working copy and planned to do an svn commit.  I thought I had set my FTP client to merge folders, but alas I hadn't.  What I ended up with was a broken svn working copy.

So I decided to pull another checkout in a separate directory.  After doing that, I needed a way to extract just the .svn folders and their files.  The quickest method I could think of was to use my FTP client (Transmit by Panic) and, this time, merge the folders together.  I am sure there is a better way, but I didn't have much time to waste searching.

To accomplish this task, I needed all other files removed.  So I wrote a function to do this:

-----

function remove_non_svn($dir)
{
    $files = scandir($dir);

    foreach ($files as $file) {
        // Skip the special entries and anything we want to keep.
        if ($file == '.' || $file == '..' || $file == '.DS_Store' || $file == '.svn') {
            continue;
        }

        // Build the full path; is_dir() on the bare name would check
        // relative to the current working directory, not $dir.
        $path = $dir . '/' . $file;

        if (is_dir($path)) {
            // Empty the subdirectory first, then remove it.
            remove_non_svn($path);
            rmdir($path);
        } else {
            unlink($path);
        }
    }
}

-----

Then I just popped that into a script, told it which folder to run against, and it went to work.  It quickly got everything cleaned up.  I then used my FTP client to merge the folders into the working copy, and after that an svn status showed the modified files and everything was working.
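To sanity-check the idea, here is a small standalone demo.  It's only a sketch: the temp-directory layout and file names are made up, and the function is the one above with the is_dir() check done on the full path.  It builds a throwaway directory containing a .svn folder and a stray file, runs the cleanup, and confirms only .svn survives.  Remember this deletes files, so try it on a copy first.

```php
<?php
// Demo of the cleanup: keep .svn, remove everything else.
function remove_non_svn($dir)
{
    foreach (scandir($dir) as $file) {
        if ($file == '.' || $file == '..' || $file == '.DS_Store' || $file == '.svn') {
            continue;
        }

        $path = $dir . '/' . $file; // full path, so is_dir() works on nested dirs

        if (is_dir($path)) {
            remove_non_svn($path);
            rmdir($path);
        } else {
            unlink($path);
        }
    }
}

// Build a tiny fake working copy: one .svn folder, one stray file.
// (The path here is an assumption for the demo.)
$root = sys_get_temp_dir() . '/wc_demo_' . uniqid();
mkdir($root . '/.svn', 0777, true);
file_put_contents($root . '/stray.txt', 'leftover');

remove_non_svn($root);

var_dump(is_dir($root . '/.svn'));           // bool(true)  -- .svn survives
var_dump(file_exists($root . '/stray.txt')); // bool(false) -- stray removed
```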

I should note that doing this is dangerous to your svn working copy and could break things if not done right.  There may also be better methods to restore your working copy to working order; I just didn't have much time on my hands to search for them.


MySQL queries using offsets without limits

While working on a project, I needed a script to loop through a table and process some commands.  However, due to the size of the table, this would surely take longer than the default 30-second timeout set up in most configurations.  I didn't want to cap the query at a fixed size, though.  I wanted my script to detect when it was nearing the timeout and stop, and otherwise keep processing.  So a standard LIMIT in MySQL wouldn't do it.

Much to my surprise, MySQL doesn't offer a way to use just an OFFSET.  You have to use OFFSET together with LIMIT or not at all.  This was really annoying, as I thought I would have to go back to limiting the query size.

Well, then I realized this could be solved another way.  I added a column and populated it with incremental numbers, then told my script to ORDER BY that column in ascending order.  With that in place, I simply added a WHERE clause to the query telling it to skip anything at or below a certain id.  That id comes from a variable passed by the user and cleaned up (safety first!).  After processing the needed commands, the script updates this variable.  Finally, when it is time to pause the script so it doesn't time out, the variable is sent along with the forwarding URL.  This method lets the script pick up where it left off when it starts again.
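The loop described above can be sketched roughly as follows.  This is an illustration, not the original script: SQLite stands in for MySQL so it runs standalone (the `WHERE id > :last ORDER BY id` trick is identical), and the `jobs` table, its columns, and a simple per-run row budget in place of the real clock check are all assumptions.

```php
<?php
// Seed a throwaway table with an auto-incrementing id column,
// playing the role of the incremental column from the post.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE jobs (id INTEGER PRIMARY KEY, payload TEXT)');
for ($i = 1; $i <= 5; $i++) {
    $pdo->exec("INSERT INTO jobs (payload) VALUES ('item $i')");
}

// Process rows after $lastId, in id order, stopping when the budget
// is spent; a real script would compare time() against the 30s limit
// and redirect to itself with ?last_id=$lastId before timing out.
function process_batch(PDO $pdo, int $lastId, int $budget): int
{
    $stmt = $pdo->prepare(
        'SELECT id, payload FROM jobs WHERE id > :last ORDER BY id ASC'
    );
    $stmt->execute([':last' => $lastId]);

    $done = 0;
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        // ... real work on $row['payload'] would go here ...
        $lastId = (int) $row['id'];
        if (++$done >= $budget) {
            break; // time to pause and forward $lastId along
        }
    }
    return $lastId;
}

$last = process_batch($pdo, 0, 2);     // first run handles ids 1-2
$last = process_batch($pdo, $last, 2); // resumes where it left off
echo $last; // 4
```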

It seems like a very simple workaround, although without the id column it wouldn't have worked.


Highslide for Wordpress Plugin