Website up and app is done

The App is done!
The other site is done!
See the site and get the app here!
*I will try to continue to improve it and post the improvements here. Up next, make the app function in a browser so you don't have to download it and run it on your machine directly.*

After finishing the project and then getting ready to go home, I had completely forgotten to update this website on how it went.

Overall I am pretty pleased with the project, and I really enjoyed working in Processing for the semester. Unfortunately, I am unable to continue working with Ben Fry in the spring semester due to scheduling conflicts with classes that I must take in order to graduate in the spring.

For those who are interested, you can find the class website that Ben made after the projects were completed here. 

One of my favorites has got to be Lee's visualization of music data from his Last.fm account. For those who do not know, Last.fm is a social music website where you can subscribe and, with a small app running in the background, send them information about the music you listen to. The purpose of this is to help you track your listening patterns, match your music tastes to other people, and in turn suggest other bands they listen to that you might enjoy. (Sort of sounds like a Big Brother thing, but I really like it; it has helped me find a few new bands that I really like now!)

If you like the app, let me know! It will encourage me to continue to work on even better versions of the program!

Until later,



After the storm ...

So the presentation went well the other day, but that still doesn't mean the final product is done yet. I still need to finish that other website documenting what I did, as well as some other things I wanted to have in the project that just didn't have a chance to make it in.

So it also seems that I will have to cap the searches at a reasonable level in order to get a faster return on them. Because I am pinging a certain server that hosts a webpage with the AOL search data, it takes longer if I raise the maximum number of results that I want back. If anyone can figure out how to speed this process up after I release the source code, please feel free to let me know how to do it.

Will update again as soon as I finish the project, which is due tomorrow.


Completed screenshot

Here is a screenshot of the final product!

Processing + Wiimote !

OK, so a few days ago I mentioned something about getting the Wiimote to work within Processing. Well, it turns out that someone just did that!

Go here to see his post and to download his stuff.

(I haven't had a chance to open the file or look at anything very closely; still rushing on that other project that is due in less than 6 hours!)

Updated screenshot

Here is where I am at right now.

Are AOL users tech savvy or perverts? (The fact that they use AOL might give you a hint.)

(The lime green is "Beast" and the smaller one in the orange is "Engadget".)

OK, well I was running a few tests on the software today, and I decided to search for blogs like "Engadget" and "Gizmodo" (and if you are on this site and do not know what they are, you are probably on the wrong site), and I saw that they actually had very few searches! I mean, over three months there should have been more than a handful of users who searched for them. Maybe they already knew the URL and didn't have to search for it, or had it favorited.

But what I did run across (while searching "macbook" and then navigating to "macbook pro review") was that people who use AOL like to search for porn, A LOT of it. I have no idea what kind of person would search for "Beastiality reviews", but it was towards the top! In fact, because the code at the time would search for every word in the phrase, it turned up an awful lot of pornographic reviews that people had searched for.

But I will let you be the judge of that.

*The current program, because of the reprogramming, does not turn up this interesting *fact* about AOL users. BUT you can find it in the earlier draft that I published down below, with a current search of "beast" (over 300 results, where my program is capped right now in the code) and "engadget" (with only 20 results).*


Crunch Time, Countdown to completion 27 hours

So I am in serious crunch time right now trying to complete the project. The final presentation is due tomorrow at 6:30, and I need to have this completed, along with another webpage "documenting/explaining" the project (which this blog might do just fine; if so, a LOT of things will be posted tomorrow, and if not I will post a link to the page).

But as far as the project goes, I think I am going to be changing which webserver I leech off of, because the current one searches for EVERY WORD in a search term and returns all of them as the result, which basically invalidates what I am doing. Man, I hope recoding that doesn't take too long....
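To show what I mean, here is a minimal sketch of the difference between any-word matching and exact-phrase matching. The class and method names are my own for illustration; the actual project code works against the remote server rather than plain strings.

```java
// Sketch: any-word matching (what the current webserver does) versus
// exact-phrase matching (what I actually want). Hypothetical names.
public class PhraseMatch {

    // Current server behavior: a logged query matches if it contains
    // ANY single word of the search term.
    public static boolean matchesAnyWord(String query, String term) {
        for (String word : term.toLowerCase().split("\\s+")) {
            if (query.toLowerCase().contains(word)) return true;
        }
        return false;
    }

    // Desired behavior: the query must contain the whole phrase.
    public static boolean matchesPhrase(String query, String term) {
        return query.toLowerCase().contains(term.toLowerCase());
    }
}
```

So a search for "pizza delivery" currently matches any query containing just "pizza", which is why my results get flooded with junk.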

More later....



A bit off topic, Darwiin Remote. Use your Wiimote with a Macbook Pro

So the other day I was checking my RSS feeds, and on either Engadget or Gizmodo I came across something called the Darwiin Remote.

*If that link is dead, which it seems to be at the time that I am writing this, you can see what it looks like and a short article on it over at The Unofficial Apple Weblog.*

This little app lets you use a Wiimote with your Macbook Pro. Right now all it will do is move the cursor around the screen as you roll the controller left or right or tilt it up or down, using the accelerometer inside of it. It doesn't really do anything interesting yet, but I think it has some potential. Especially with Processing.
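The core idea of tilt-driven cursor movement is simple enough to sketch. This toy example is not from Darwiin Remote's source; the scaling, dead zone, and names are all invented just to show the concept.

```java
// Toy sketch of mapping an accelerometer tilt reading to a cursor delta,
// the kind of thing Darwiin Remote does. All values here are invented.
public class TiltToCursor {

    // Map a tilt reading in [-1.0, 1.0] to a pixel delta per frame,
    // with a small dead zone so jitter doesn't move the cursor.
    public static int tiltToDelta(double tilt, int maxSpeed) {
        if (Math.abs(tilt) < 0.1) return 0; // ignore tiny hand tremors
        return (int) Math.round(tilt * maxSpeed);
    }
}
```

In a Processing sketch you would call something like this every frame for each axis and add the deltas to the cursor position.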

The Japanese programmer who developed this also released the source code to go with it, so hopefully soon some great programmer can turn that into a library for Processing.

*And for those of you using a PC, I have heard that there are other projects and cool things being done at WiiLi.org. At this time it seems that the WiiLi website does not have anything up either, but I hear that when it is up, it has a lot of things/projects that people talk about on their forums.*

Until later,

A new direction

I discussed more of the project with Ben today. Through that discussion, I realized that I want to show a lot more than what the current program shows. All the data that I keep getting from it looks fairly consistent, and I doubt that I will be able to get any real information out of the project if I continue the way I originally planned. Under my original plan, I wanted to be able to search for a certain term (i.e. pizza), then not only see the way the term was searched over time, but have the program pull similar terms out of the database (i.e. pepperoni, hut, dominos) and be able to click on those terms to search them and compare them to the previously searched terms.

Now I sort of have that functionality set into the program, and it works to some extent. It is far from being completed to a level I would find acceptable, but it is complete enough to know that presenting it that way is the wrong path to follow.

So, still staying with that same *basic* idea, I believe that I will be retooling the data in order to get back to the idea of navigating the data connections. You should see some development over the weekend on this, but finals are really going to be rough this year with almost all of them on Monday and Tuesday. Luckily I have the Processing one on Thursday!


New version of Processing out

A few days late, but I wanted to post that there is a new update of Processing on the website. We are up to Processing 0123 now. (Link to processing.org on the right.)

A surprise to me, because I somehow missed 0122.

**Notes on the beta of the AOL search data program**

So first of all, right now it is limited to presenting the first 30 results of a search term. This can be easily changed in the Search tab on the 10th line, which reads "static final int MAX_RESULTS = 30;".

The second thing I needed to say is that when you search for a term, how often it was searched can make a big difference in how long it takes to get a result back. Try "asdf" and you will get something back very fast, with only 4 entries. Enter "Google" and it will take a long time, because it was the most searched term (kind of ironic, really). It also seems to take a long time if you enter a string of common words.
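For anyone curious what the cap actually does, here is a minimal stand-in sketch. The MAX_RESULTS constant is the real one quoted above; the class name and array-based plumbing are my own simplification of the actual fetching/parsing code.

```java
// Sketch of the MAX_RESULTS cap: whatever the server sends back,
// only the first MAX_RESULTS entries are kept. Hypothetical class name.
public class SearchCap {
    static final int MAX_RESULTS = 30; // same constant as in the Search tab

    // Return at most MAX_RESULTS rows from the server's response.
    public static String[] capResults(String[] serverRows) {
        int n = Math.min(serverRows.length, MAX_RESULTS);
        String[] capped = new String[n];
        for (int i = 0; i < n; i++) capped[i] = serverRows[i];
        return capped;
    }
}
```

Bumping MAX_RESULTS up just means more rows survive the cap, at the cost of a slower round trip to the server.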

First archived sketch, not perfect, not by a long shot

So I feel that it is about time that I release the first version of the AOL search data to the world. Hopefully I can get some useful feedback, so feel free to get back to me about this.


I also thought that you guys might want to see some of the original sketches and the very start of the project, so without further ado, here is the first sketch that I did with the program searching the database for 'torrent', and an early comped image of where I thought this project was going to head from the beginning.


Meeting with Ben

I met yesterday with Ben Fry to go over my project and we discussed the code and how it would work. After helping me past some sticking points, we discussed possibly going in a different direction with the interface and the visual presentation of the data.

Since I have gotten some of the data to appear in the chart form like I wanted, I am noticing that the data starts to look all the same when viewed this way (especially the averages for the day of the month or the hour of the day). So we discussed other ways to work with the data I am bringing in that would be much more interesting to search through.

Some of the ideas that we discussed were to use the data to show connections a la Visual Thesaurus, which could be potentially interesting. We also talked about moving in and out between the data in more of a list view, where the user can quickly move between things and see the various connections between search terms.

I wish I could devote most of my time to this project, but with finals coming up, I am finding that I have less and less time than I anticipated. I should post some images here soon, and a copy of the program once I can get it working so that people can run it off of a webpage, which it cannot do currently because it is accessing another server.

Until next time,



Long time since an update, things coming along well.

After working on the program in the dark for a long time, I have finally figured out most of the code to do what I wanted. There are still some bugs, and a lot more features I want to add, like being able to save data in a graph to compare more than one search at the same time. I will post an image and the current program as soon as I get some more free time; I plan to work on it for the rest of the night.


Update on the project

So I have been thinking about the project recently. At first, I wanted to do something that would graph the data for a search term the user inputs, displayed as averages by month, week, and day. But I have been thinking that instead of using these three graphs and overlaying other search terms discoverable through the interface, I might move to an entirely discoverable program that lets the user navigate through the data in a fluid motion, with terms similar to the searched keyword presented as a list of links, along with other top searches by the group or possibly the individual.
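The "average day" graph boils down to bucketing timestamped queries by hour. Here is a minimal sketch of that aggregation, assuming "HH:mm:ss" timestamps; the class name and field layout are my own, not the AOL file's exact format.

```java
// Sketch of hour-of-day bucketing for the "average day" graph,
// assuming each query row carries an "HH:mm:ss" timestamp.
public class HourlyHistogram {

    // Count how many queries fell in each hour 0-23.
    public static int[] countByHour(String[] timestamps) {
        int[] counts = new int[24];
        for (String t : timestamps) {
            int hour = Integer.parseInt(t.substring(0, 2));
            counts[hour]++;
        }
        return counts;
    }
}
```

The month and week views would be the same idea with different bucket keys, which is partly why all three charts end up looking so similar.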

Some sketches of this and other possible interfaces will be uploaded soon; let me know what you think.



A little bit about me

I am a senior at Carnegie Mellon University where I study Communication Design and Human-Computer Interaction. I am currently enrolled in a senior design studio course titled "Computational Information Design" which is being taught by Ben Fry, one of the people who developed the software we are using for the course, Processing (check out the link on the side). The purpose of the course is to teach us how to do design work using programming as a tool instead of just the Adobe Creative Suite products.


Welcome + Birth of the project

Processing AOL was created as a way for me to archive and display this project to the rest of the world using Processing, which was developed at the MIT Media Lab by Ben Fry and Casey Reas. Processing AOL is a project in which I am going to try to take the AOL search data that was released over the summer of 2006 and compile it in a unique way that shows patterns through the search data based on related search terms, search times, and other variables.

I am going to be working on it this fall, and it should be completed by early to mid December. Once it is completed, I will upload the application to this blog so everyone out there can play around with it and give me feedback on the project.

I am really excited about this project and will try to keep this blog updated as I work through this problem.

Until then, I will try to keep an open dialog on this page and hope to hear from all of you out there.