Get your macro on

February 7, 2010

I’ve been trying to get a macro working that Torley Linden recommended a while back. In it, Torley sets up a macro to copy a parcel’s name and description and paste them into an email snapshot. The code for the macro didn’t work for me, but I finally found a way to fix it. My next question is whether there is a way to open the “About Land” dialog with a keystroke.

{#CTRL -chars a}{#CTRL -chars c}{#CTRL {#SHIFT -chars S}}{#sleep 1500}{#TAB -count 2}{#ENTER}{#TAB -count 2}{#CTRL -chars v}{#CTRL {#TAB}}{#TAB -count 2}{#CTRL -chars a}{#CTRL -chars c}{#CTRL -chars w}{#TAB}{#CTRL -chars v}{#SHIFT {#TAB -count 3}};;;;;{#TAB -count 5}

posted by Dedric Mauriac on Here using a blogHUD : [blogHUD permalink]

I Love You, Cron

January 9, 2010

My gosh, could it be? I believe I may have finally solved all of my problems with the cron manager. This isn’t the first time I thought this way, but I’ve just been through way too much.

I was able to create a bash script that executes a PHP page 20 times, at 15-second intervals. The script itself is executed every five minutes through the cron manager. So, here is the script that I created. It’s odd, but it works.

i=0
while [ "$i" != "20" ]; do
i=`expr $i + 1`
/web/cgi-bin/php5 "$HOME/html/event.php" >> /dev/null
sleep 15
done

I wasn’t able to get a proper “for” loop running, but I found out how to do a similar loop using a “while” statement. Incrementing the variable “i” was odd. I’m used to “i++”. With this script, I feel like I had to jump through hoops to get an odd expression to add 1 to i. You know what? It works. I am not sure why I am getting picky over its perfection.
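For what it’s worth, POSIX arithmetic expansion does the same increment without spawning an expr process on every pass. Here’s a minimal sketch of the same loop, with the php5 and sleep calls stubbed out as comments:

```shell
#!/bin/sh
# Same 20-iteration loop, using $(( )) arithmetic expansion instead of expr.
i=0
while [ "$i" -lt 20 ]; do
    i=$((i + 1))
    # /web/cgi-bin/php5 "$HOME/html/event.php" >> /dev/null
    # sleep 15
done
echo "ran $i iterations"
```

The `$(( ))` form is built into the shell, so it behaves the same whether or not expr is on the PATH.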

I’m not getting any emails from normal operations either, which is what I like. I don’t need to be getting emails every five minutes when a job completes without error.

posted by Dedric Mauriac on Applewood using a blogHUD : [blogHUD permalink]

The cron strikes back

January 9, 2010

I’m still finding that I have mail from cron each time the job executes. I believe I may have found the answer this time. I was originally redirecting only the output of the last command to /dev/null. I went ahead and made sure that all commands calling php5 now redirect their output there.

That left me with just one error in my mails: something about an unterminated string value. I couldn’t find any strings that were not closed, so I removed the quotes everywhere, since none of my paths contain spaces. When I did this, it started complaining about an unknown “sle” command. AH HA! Apparently there is a limit to the length of my commands, and the final “sleep” was being truncated.

I started scratching my head and looking around. At first I tried to find a way to use batch scripts, but it looked like I needed to play around with chmod and grant execute permissions on a text file. I noticed some examples used logic to loop through files, so I hunted for a way to throw some logic into my commands. It wasn’t long before I found a beginner’s guide to Linux shell scripting which demonstrated how to use the “for” command. It is very similar to how I program in JavaScript, C#, PHP, LSL, Java, etc. Here is one of my tests that I ran directly from the command line:

for ((i = 0; i < 5; i++)); do echo $i; sleep 1; done

Using a loop, I was able to get 39 commands trimmed down to about 10 percent of their initial size.

It’s not always a perfect world, though. Things that appear to work often don’t work as expected. Even though my tests worked directly on the command line, I’m still getting feedback from the cron manager about problems. The latest one is “/bin/sh: Syntax error: Bad for loop variable”.

This latest error indicates that different interpreters don’t understand the script in the same way. The crontab appears to run things in a different context than the command line itself. I’m running back to looking at batch files again (shell scripts), because the first line lets you identify which interpreter to use when processing the script. It’s plenty of technobabble, I suppose, but I’m determined to find a working, proper solution to this problem.
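That first line is the “shebang”. Assuming the host has bash installed at /bin/bash, a sketch like this forces the C-style for loop to be parsed by bash instead of cron’s stricter default /bin/sh:

```shell
#!/bin/bash
# The first line names the interpreter, so cron runs this with bash
# even when its default shell would reject the (( )) syntax.
for ((i = 0; i < 5; i++)); do
    echo "$i"
done
```

Saved to a file, marked executable with chmod +x, and called from the cron job, this sidesteps the “Bad for loop variable” error entirely.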

posted by Dedric Mauriac on Applewood using a blogHUD : [blogHUD permalink]

Getting past Cron Manager constraints

January 9, 2010

I started to look at the Cron Manager offered with my Linux shared hosting account. It’s a web-based interface (hosting control center v2.10.0) rather than dealing with crontab directly. I quickly discovered some constraints that I needed to work around. First, I can only have a maximum of 10 jobs. Second, the minute that a job runs is constrained: it has to be a multiple of five. My original plan was to have a job run every minute and use sleep commands to run some PHP code every 15 seconds.

I found that they offer an advanced form which enabled me to run a job twice in an hour: if I choose to run a job at 5 minutes past the hour, it will also run at 35 minutes past. With this, I figured I could make six jobs to execute a PHP page at 0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, and 55 minutes past the hour. OK, this was a start, but not good enough. I need the site to execute events every 15 seconds at most.
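For reference, a single entry in that six-job scheme would correspond to a standard crontab line like this (a sketch on my part; the web manager hides the raw syntax):

```
# One job firing at :05 and :35 each hour, as the advanced form allows
5,35 * * * * /web/cgi-bin/php5 "$HOME/html/event.php"
```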

I had a problem with crontab’s resolution being one minute at best. Looking at Chris Stephens’s suggestion about the sleep command, I noticed that I could run multiple commands in one job simply by separating them with semicolons. I could pull this off by having my jobs continuously poll the PHP page every 15 seconds for five minutes. It was a long shot, but I figured I would see if it would work. The manager accepted my long list of commands. I then made some changes to my code and was able to confirm that my database was being updated every 15 seconds. For those of you who are interested, here is my command. It is 20 calls to a PHP page delimited with 19 sleep commands.

/web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 15; /web/cgi-bin/php5 "$HOME/html/event.php"

There are a few problems with this setup, though. A PHP call doesn’t take zero seconds, so the current job could eventually overlap into the next job. The next is that if the job fails for some reason, the next call to the PHP page will not occur for (at most) five minutes. That could be a long wait for some time-sensitive events.
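One common way to keep an overrunning job from stacking on top of the next one is an atomic lock. This is just a sketch of the idea (not something from my actual setup), using mkdir, which either succeeds or fails as a single operation:

```shell
#!/bin/sh
# Sketch of an overlap guard: mkdir is atomic, so only one instance
# of the job can create the lock directory and do the work.
LOCKDIR="${TMPDIR:-/tmp}/event.lock.d"
if mkdir "$LOCKDIR" 2>/dev/null; then
    trap 'rmdir "$LOCKDIR"' EXIT   # release the lock when we exit
    status="running"
    # /web/cgi-bin/php5 "$HOME/html/event.php" >> /dev/null
else
    status="skipped"               # a previous run still holds the lock
fi
echo "$status"
```

A second copy started while the first still holds the lock would print “skipped” and exit instead of piling up.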

I could have used 10 jobs in total and attempted to execute every 3 minutes, but the cron manager limits my commands to executing every 5 minutes (not 3). However, there would be a way to use all 10 jobs by staggering. In all, I need 120 calls to the PHP page every 30 minutes (30 minutes is 1,800 seconds; 1,800 divided by 15 = 120). Ten jobs would only need to call the page 12 times each for the duration of the job. The delay between each PHP call would be two and a half minutes (1,800 seconds divided by 12 = 150 seconds = 2:30). So with staggering, I could simply insert a sleep first to delay each job from starting by 15, 30, 45, 60, 75, etc. seconds. If a job has a problem, or tends to run a bit long, the next job picks up the next workload within 15 seconds. There are problems with this approach as well. First, you have 10 jobs constantly running at the same time for 30 minutes. I don’t know what kind of effect that would have on resources. The next problem is that during a server reboot, you would need to wait at most 30 minutes before the jobs start running again (if the server comes back online at 12:01 or 12:31, jobs will not run until 12:30 or 1:00).

sleep [delay 15,30,45,etc.]; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"; sleep 150; /web/cgi-bin/php5 "$HOME/html/event.php"
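The staggering arithmetic can be sketched as a small loop that prints each job’s start offset (the job numbering and wording here are mine, not the cron manager’s):

```shell
#!/bin/sh
# Sketch: compute the initial delay for each of the 10 staggered jobs.
# Job n sleeps n*15 seconds, then fires its 12 calls 150 seconds apart,
# so across all 10 jobs the page gets hit every 15 seconds.
n=0
while [ "$n" -lt 10 ]; do
    offset=$((n * 15))
    echo "job $n: sleep $offset, then 12 calls at 150s intervals"
    n=$((n + 1))
done
```

The last job starts 135 seconds (9 × 15) into the window, which is what keeps the coverage gapless.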

posted by Dedric Mauriac on Applewood using a blogHUD : [blogHUD permalink]

No resellers please

December 9, 2009

I’ve set up the inventory script so that it does not pass on items that the owner did not create. Although people can get around this by putting items inside a prim that they did create, it’s more of a deterrent. On a prim with almost 2,000 items, only 134 of them came through as having me as the creator. On the back-end, I noticed that nothing was happening and it was taking a long time; the script was constantly looping through a ton of items. To head off possible lag, I’ve added a two-second sleep so that people don’t start abusing the performance of the region that the servers are hosted on.
posted by Dedric Mauriac on Woodbridge using a blogHUD : [blogHUD permalink]

BlogHUD down

December 2, 2009

I noticed that all of my images on my blog are broken. I use a third party service called “BlogHUD” that accepts postcards from Second Life (R) as email and cross-posts them to my WordPress blog. Every now and then it doesn’t pick up on my messages. This is the first time that I’ve seen all of the images broken as well. All images are hosted on the BlogHUD site. I don’t have control over where they are hosted.

Oddly enough, I found a problem that I wasn’t aware of. Usually I can ping domain names from the command prompt. I tried it today and was shocked at the results: apparently my operating system has no idea what “ping” is.

Odd. Anyhow, I need sleep. night.

Lightweight Vendors

November 23, 2009

I usually come up with some interesting ideas when I go take a walk or get a good night’s sleep. The latest idea was how to improve the vendor experience. Most networked vendors are loaded with lots of scripts and prims. Using touch-based mapping and the 5-faced prim that I use for my letter positioning, I can essentially create a vendor that has 10 preview panels, paging, and a main product presentation with 3 prims and 1 script. I’ve made a mockup, and it looks like it can be done. I’ll need to get a rudimentary web interface working first to assign images, note cards, and prices to items.
posted by Dedric Mauriac on Woodbridge using a blogHUD : [blogHUD permalink]
