Monday, November 26, 2012

Dell Vostro without Windows Tax

I guess today is "Cyber Monday" or something like that? I'm looking for a reasonably priced laptop for my 13-year-old daughter and saw that I had an email from Dell with some offers in it. After a little clicking around to compare different laptops, I ended up on a comparison page for the Vostro line. Naturally, I was drawn to the $299 laptop, when I noticed that there are 2 computers that differ only in the pre-installed OS, and in price by $70! This is the first time I've just naturally stumbled upon such a blatant exposure of the Windows tax.

Monday, November 19, 2012

The Road to 14.04

Copenhagen was quite a UDS. I found it to be very "tight" and well organized. The sessions seemed to be extra productive. I think compressing UDS down to four days helped us be more focused, and also less tired by the end. Maybe the next one should be three days? Copenhagen was also great for the content. Working on getting the desktop running on the Nexus 7 was very interesting and fun. Also, we made a lot of progress in terms of how we make Ubuntu. I think this will be an unusually fun cycle thanks to some of the changes to our development process.
Sleestacks are monsters bent on keeping you in the past

One of the great things about my job is that I get to talk to so many people about their vision for Ubuntu. As you can imagine, I run into a lot of variety there. However, a certain shared vision has come together. As we all look forward to 14.04, we can mostly agree about what we want that release to look like.

Seeing the future through conversations with community members, developers, users, etc....
The 14.04 Vision
Imagine that you are running Ubuntu 14.04. What will your experience be like?

  • Robust
  • Fresh
  • Easy to Make
  • Ubiquitous

First, the quality will be impeccable. You will not even think about up-time. The system will work close to perfectly. You will eagerly take updates and upgrades knowing that they will only make your system better. Applications will run smoothly and won't cause any system-wide problems.
Secondly, you will be able to get the freshest apps that you want on your client machines, and the freshest workloads on your servers. As a developer, you'll be able to deliver your applications directly to end users, and be able to easily update those applications.
Thirdly, we will have an extremely efficient release process, one that also inspires confidence in developers. Good changes will reach end users quickly, while mistakes are easily caught and corrected well before users are exposed to them.
Finally, by 14.04 Ubuntu will run everywhere. The same Ubuntu will be on your phone, your tablet, your netbook, your laptop, your workstation, your cloud server hosts, and the instances powering workloads in your public and private clouds. The same product with the same engineering running everywhere. A simpler world with Free code everywhere.

How Do We Get There?

"It's more important to know where you are going than to get there quickly"
We have laid out the steps necessary to achieve this vision. We intend to make this real! In fact, we've already achieved some of the things necessary to get to where we want to be for 14.04. Overall, by 14.04 we need to:

  • Assure quality at every step of the development process
  • Improve application sand-boxing
  • Simplify the release schedule
  • Implement continuous integration in the Ubuntu development process
  • Expand Ubuntu to include mobile form factors, such as phones and tablets

Assured Quality at Every Step

Leaping through time ensuring everything stays on track
In 13.04 we will change our full testing cadence from testing at each Alpha or Beta milestone to doing full test runs every 2 weeks. This is about a threefold increase in the rate of manual community testing. Furthermore, we will test more broadly, more deeply, and more rigorously, so that we will have a more complete view of the quality of Ubuntu during the development release.
We will also be leveraging some previous work to create a GUI testing tool. We call this tool "Autopilot" and it is designed to drive all the components of Unity in a testing environment. In 13.04 we will see expanded usage of this tool, and critically, dramatically more tests written. We will then be able to catch regressions in the Ubuntu user experience earlier, and ensure that fewer regressions make their way into the development release.
The one and only Martin Pitt has implemented a new test harness along with tests for GNOME. In this way, the Canonical QA labs will be able to identify regressions in GNOME as soon as they are introduced. This should allow GNOME developers to more quickly spot and fix problems, raising the overall quality of the GNOME code and improving the velocity of the GNOME developers.
Finally, after 13.04 ships, we will start doing updates in a new way. Once 13.04 is a stable release, updates to that release will not be delivered to all users as soon as they are available. Rather, updates will go out to a small number of users, and the system will automatically monitor whoopsie-daisy to ensure that users aren't experiencing issues due to the update before releasing it to yet more users. We call this "phased updates".
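The idea behind phased updates can be sketched in a few lines of Python. This is a toy model only; the function and names are invented here, not Ubuntu's actual update tooling:

```python
# Toy sketch of "phased updates": widen the rollout in stages, and
# halt if the crash-report telemetry (whoopsie-daisy in the post)
# shows problems in the users updated so far.
def phase_update(all_users, batch_is_healthy, phases=(1, 10, 50, 100)):
    updated = []
    for pct in phases:
        target = len(all_users) * pct // 100
        batch = all_users[len(updated):target]
        updated.extend(batch)
        # batch_is_healthy stands in for querying crash reports
        if not batch_is_healthy(batch):
            return updated, False  # halt: only this fraction got the update
    return updated, True           # rollout completed for everyone

users = ["user%d" % i for i in range(200)]
rolled_out, complete = phase_update(users, lambda batch: True)
halted, done = phase_update(users, lambda batch: False)
```

If the telemetry never flags a problem, everyone ends up updated; if the very first batch looks unhealthy, the rollout stops at the initial 1%.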

Ubuntu Continuous Integration

Every day is like the day before, Daily Quality
There are 2 new things that we are doing in 13.04 to get us to the world of 14.04 where releases are easy and confidence inducing. First, Colin Watson has set up the Ubuntu build system so that all packages are built in a staging area (by reusing the proposed pocket for the development release). Only when a package is built successfully along with all of its dependencies are the packages copied into the release pocket and sent out to the wider development release. This means that there will be no more breakages due to out-of-sync packages when you update. Compiz and Nux will always be built together before they are copied over. The whole xorg stack too.
Building things in proposed provides an opportunity to assure the quality of the packages before they go into the release pocket. This will be accomplished with auto-package testing. Essentially, tests that come with a package will be run when the package is built. Additionally, any package that depends on the new package will also run its auto-package tests. The package will only be copied into the release pocket when all of the tests pass!
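The gate described above can be sketched as a simple predicate. The names here are illustrative only; the real migration logic lives in the archive tooling:

```python
# Toy model of the -proposed gate: a group of interdependent packages
# (e.g. compiz and nux) migrates to the release pocket only when every
# member built successfully and every auto-package test passed.
def can_migrate(group, built, tests_pass):
    if not all(built.get(pkg, False) for pkg in group):
        return False  # something in the group failed to build
    return all(tests_pass.get(pkg, True) for pkg in group)

ok = can_migrate(["compiz", "nux"],
                 built={"compiz": True, "nux": True},
                 tests_pass={"compiz": True, "nux": True})
blocked = can_migrate(["compiz", "nux"],
                      built={"compiz": True, "nux": False},
                      tests_pass={"compiz": True})
```

Until `blocked` would come back `True`, users of the development release simply keep the last consistent set of packages.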

Start Application Insulation

Protecting the world from pure evil

By 14.04 we expect most applications to run in a secure manner, so that poorly written or even malicious applications will have limited opportunity to do damage to a user's system. In 13.04 the Security Team is moving ahead with lots of work to enable AppArmor throughout Ubuntu, in addition to isolating some common infrastructure in use today, such as online accounts, gnome-keyring, and even dbus.
In this way, applications will be able to run and access only the small subset of the system that is relevant to them. When a user installs an application it will come with an AppArmor profile that ensures that the kernel can insulate the system from the application appropriately. The fruits of this labor should be widely visible by 13.10.
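As a rough illustration, a profile for a hypothetical application (the application name and paths below are invented for this sketch, not from any real package) might look something like:

```
# /etc/apparmor.d/usr.bin.exampleapp -- hypothetical profile
#include <tunables/global>

/usr/bin/exampleapp {
  #include <abstractions/base>

  # the binary itself is readable
  /usr/bin/exampleapp r,

  # the app may write only under its own config directory
  owner @{HOME}/.config/exampleapp/** rw,

  # anything not granted here is denied by the kernel
}
```

The kernel enforces the profile, so even a compromised application can only touch what its profile grants.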

Simplified Releases

A time turner literally creates more time
Ubuntu has traditionally held a series of Alphas and Betas. These had the purpose of ensuring that we had an installable image at least a few times during the release cycle, and of providing an opportunity to do some wide testing of the system. This meant that several times throughout the release cycle we would stop development on Ubuntu, freeze the archive, and roll a release.
Since the advent of daily quality, Ubuntu can install pretty much every day. Furthermore, we are opting for much more frequent testing than the milestones allowed. Therefore, the Alphas and Betas have limited utility, but would have continued to sap our development velocity. So, in 13.04, Ubuntu is making the bold move of skipping all Alphas and having just a single Beta! This also allowed us to extend certain freezes, especially Feature Freeze. The new schedule has much more time for finishing features and fixing bugs, and much less time in freezes.

Monday, June 18, 2012

Let's Roll with 12.10

As a consequence of our daily quality efforts, some very interesting developments have taken place for 12.10.

First, while knocking around UDS, it occurred to me in a bit of a flash that all of the effort we invest in freezing the archives to make Alpha and Beta releases for Ubuntu is wasted work that slows down our velocity. We have daily quality and we have started using -proposed in the development release, so the chance of having an uninstallable image is greatly reduced.

(you can read the discussion on @ubuntu-devel)

So, why do we have Alphas and Betas? After some discussion, it seems to come down to:

  1. Because we want to encourage widespread testing by community members on a variety of hardware at a regular cadence
  2. Because we want targets for features and bug fixes
  3. Because we need to test our ISO production capabilities
  4. Because we always had them

Does all the effort in freezing the archive actually help? I don't think so. In fact, I think it is counter-productive.

  1. We can do the same testing with daily images. Furthermore, we can do that testing at a cadence of our liking, or even out of cadence if we want to squeeze in a special test run at some point. The ISO tracker nicely accommodates this now.
  2. Freezing the archive, by definition, *stops* packages and therefore bug fixes and features from getting uploaded. 
  3. Surely we don't need to slow down everyone's work so that we can try producing ISOs, and surely we don't need to do it so often and early.
  4. Of course, "because we always did" is not much of a reason.

It seems that what is needed is a regular cadence of deep and broad testing by the community to augment our automated tests, along with trial runs to ensure that our ISO building tools and process are working. Therefore, I propose we:

  1. Stop with the alphas and betas and win back all of the development effort
  2. Increase the cadence of "ISO testing" to whatever we want or whatever the community team can manage
  3. Spin a trial ISO near what is now beta time
  4. Spin ISOs for release candidates

Tuesday, June 12, 2012

«Bonjour Le Monde» ("Hello, World")

Pardon me for massacring a beautiful language ...

I have been dreaming of teaching a programming course in French. The course would have two objectives. The first objective is an introduction to the world of programming. The second objective is French practice for me. The students learn programming, and I learn to speak more French.

So, I present to you:
La Programmation pour Les Debutantes Absolus, En Mauvais Français (Programming for Absolute Beginners, in Bad French)

I will give the course twice a week, in the evening. I will also hold office hours for questions and discussions.

We will use Apprendre à programmer avec Python. I will cover two chapters each class, but we will stop at chapter 12 or before. So, the course will run about three weeks.

We will use Google Hangouts for the classes, so a webcam and microphone are necessary, but the course is free. Also, the course uses Ubuntu, naturally ;)

Finally, I need to find students. If you want to start learning programming (and help me with my French), you can leave a comment here, find me on IRC (rickspencer3 on freenode), or send me an email. Once I have found students, we will find good times for the classes.

Tuesday, April 24, 2012

Making GtkWebKit Inspector and enable-developer-extras actually work in Python

I have become quite fascinated by using HTML5 for rendering my GUIs on my Ubuntu applications. I love doing this, because I can continue to use Python as my library and desktop integration point, while being free to use cutting edge presentation technology.

I sat down with didrocks yesterday and we set off to create a simple Quickly template out of some of the code I've written for bootstrapping these projects. The template will be very, very simple. All it will do is set up communication between HTML and JavaScript in a GtkWebKit window, and a Python back end. Developers will be free to choose how to use the WebKit window. For example, they could use JQuery or a host of other JavaScript libraries if they choose.

Didrocks was adamant that we should expose the excellent debugger (called The Inspector) that comes with WebKit in the template. However, I have found that for GtkWebKit, the documentation is sketchy (at best), and the API is unpredictable in its behavior. So, it took us 2 hours of experimentation and trolling source code to make an implementation that actually worked for showing The Inspector.

So, if you have been trolling the web looking for how to make this work ... I hope this works for you! Without further ado, here is a commented minimal example of a WebKit window that shows the Inspector. I also pushed a branch with just the code in case you find that easier to read or work with.
from gi.repository import WebKit
from gi.repository import Gtk
import os

#The activate_inspector function gets called when the user
#activates the inspector. The splitter is a Gtk.Paned and is user
#data that I passed in when I connected the signal below.
#The important work to be done is to create a new WebView
#and return it from the function. WebKit will use this new view
#for displaying The Inspector. Along the way, we need to add
#the view to the splitter.
def activate_inspector(inspector, target_view, splitter):
    inspector_view = WebKit.WebView()
    splitter.add2(inspector_view)
    inspector_view.show()
    return inspector_view

#create the container widgets
window = Gtk.Window()
splitter = Gtk.Paned(orientation=Gtk.Orientation.VERTICAL)

#create the WebView
view = WebKit.WebView()

#Use set_property to turn on enable-developer-extras. This will
#cause "Inspect Element" to be added to WebKit's context menu.
#Do not use view.get_settings().enable_developer_extras = True,
#this does not work. Only using "set_property" works.
view.get_settings().set_property("enable-developer-extras", True)

#Get the inspector and wire up the activate_inspector function.
#Pass the splitter as user data so the callback function has
#a place to add the Inspector to the GUI.
inspector = view.get_inspector()
inspector.connect("inspect-web-view", activate_inspector, splitter)

#make a scroller pane to host the main WebView
sw = Gtk.ScrolledWindow()
sw.add(view)
splitter.add1(sw)
window.add(splitter)

#put something in the WebView
html_string = "<HTML><HEAD></HEAD><BODY>Hello World</BODY></HTML>"
root_web_dir = os.path.dirname(os.path.dirname(__file__))
root_web_dir = "file://%s/" % root_web_dir
view.load_html_string(html_string, root_web_dir)

#show the window and run the program
window.set_default_size(600, 400)
window.connect("destroy", Gtk.main_quit)
window.show_all()
Gtk.main()

Friday, March 30, 2012

Thanks mterry! (Quickly Tutorial Updated) :)

So I decided I had to bite the bullet this morning and update the ubuntu-application tutorial for Quickly, since desktopcouch is no longer supported and I therefore removed CouchGrid from quickly.widgets. So I started looking through the tutorial to make notes about what I needed to change, and I found everything already fixed by Michael Terry! Amazing. I love working (or in this case not working) on open source projects ;)

Thursday, March 29, 2012

12.04 Quality Engineering Retrospective

Ubuntu 12.04 LTS Development Release (Precise Pangolin) Beta 2 is (most likely) going to be released today. This means we are getting quite close to final release! I have been running Precise as my only OS on both of my computers for months now, and it is far and away my favorite desktop I've ever used. It is beautiful, fast, and robust. This post is about the robust part.

After we release Beta 2, we should continue to see Ubuntu and Ubuntu Server improving day by day and quickly achieving release quality. I have asked Kate Stewart, our release manager, to do everything in her power to ensure that starting with Final Freeze on April 12th each daily image is of high enough quality that it could be our final release.

Why am I so confident that Ubuntu will only get better and better? Because Ubuntu stayed at a usable level of quality throughout the development cycle. This created a virtuous cycle: Ubuntu was easier to develop and test with, which in turn made it easier to maintain the quality.

After the last UDS, I described how we planned to maintain quality throughout the release. We followed those plans, and got the expected results. I am very, very proud of what the teams accomplished.

But rather than repeating the activities that we did, I thought I would look back and see what values arose from those practices. I think it was the following values that really had the impact, and that we should build on for 12.10 and beyond.
  1. Verify and fix before landing major changes in Ubuntu
  2. Not waiting when something breaks to take action
  3. Test for testability, then test rigorously
Let me provide some specific examples for each.

Verify And Fix Before Landing

Previously, teams would rush to meet certain development milestones, with the goal of meeting the letter of the law. A package had to be uploaded before Feature Freeze, for example, so a team would just push what they had, even if it was not proven to work, or even known not to work at all!

In 12.04 we took a different approach with packages that tended to have significant impact on the usability of Ubuntu, or that were otherwise important. In fact, the xorg team has been following this approach for many releases, using their "edgers" PPA and calls for testing. For many releases new versions of X were vetted by a community of dedicated community testers before being uploaded to the development version of Ubuntu. In 12.04, they took this even a step further. In previous releases, while different parts of the X stack were building, users of the development release might upgrade while the archive was in an inconsistent state, because different parts of the new X stack were built while others were still building or waiting to build. This could result in situations where X was uninstalled altogether! In 12.04, the X team actually built the X stack separately, and then copied the binaries into the archive. Totally verified and fixed before landing!

Many folks have noted the dramatically increased robustness of Unity during the 12.04 development cycle. The Unity team did a lot of work to tune and improve their development processes. This included using a PPA for each new release of Unity, and then having that release rigorously tested (with test cases, a testing tool, etc...) by community members with different kinds of graphics hardware and other setups. Regressions and problems were then fixed in the PPA, testing was repeated, and only then was the release uploaded to the development release.

Ultimately, though, I think Ubuntu Cloud must take the prize for rigor in this area, with their OpenStack testing and verification process. On each and every commit to OpenStack upstream, OpenStack gets deployed to a Canonical Cloud test bed (deployed with Juju, of course), and then a full suite of tests is run. If the tests pass, it gets automatically built into a PPA. When the team is ready to do a release into Ubuntu, they can make the many necessary tweaks in the PPA before uploading it to Ubuntu. This level of precision allowed the Server team to stay with cutting-edge OpenStack, while maintaining a system that was always working, and therefore testable.

Don't Wait when Something Breaks

This value has really taken hold in the Ubuntu community, and it has really helped. There are 2 areas that I monitor each morning. First, I check how the archives look. I can do this because the Plus One Maintenance team, led by Colin Watson and Martin Pitt in turns, have written a tool that finds problems in the archives. Furthermore, each morning they strive to fix those problems. In this way, uninstallable packages and other problems are fixed before we try to spin that day's daily ISO.

After spinning the daily ISO, the QA team runs a set of smoke tests on it. If the tests cannot run, or fail, the right engineering teams are notified, and they either fix the tests or fix the failure so we can try spinning the CD again. The daily response meant that it was pretty certain that issues were introduced in the last 24 hours, which in turn made them easier and faster to resolve.

Still, Ubuntu development is incredibly rapid. We didn't want to set up a situation where people were afraid to make changes because they might break something. Therefore, from the beginning of the cycle, we accepted that our testing would not catch everything, and that some things would break. So, we set the goal of quickly reverting changes that caused the development release to be hard to test or use. We only had to resort to this a few times. For example, at one point, LightDM was not able to load any but the default desktop. As a result, it was not possible to use desktops like Kubuntu, Xubuntu, etc... The change was reverted the same day so that testing could continue.

Test for Testability, then Test Rigorously

So, we now have automated testing of Canonical upstream code, as well as daily images and daily upgrade testing. However, we don't consider this the end of the testing process, but the beginning. In other words, we use the automated tests to tell us if the code trunks and images are worth testing harder.

In 12.04 development, we evolved our community testing practices to meet this need. In the past we would do a "call for testing", which meant "please update and try out Ubuntu, and let us know if anything broke". In 12.04, a "call for testing" changed to include test cases (so that we could know what worked, not just what broke), coverage of hardware and configurations by recruiting community members who had the right setups, and organized results.

This thought process was not limited to only our Canonical-produced code, however. Before, or soon after, introducing potentially disruptive changes, Nicholas Skaggs, our new Community Team member, collects test cases from the relevant developers, and then organizes community members to execute those test cases. He is also organizing these tests at important milestones, such as Beta 1 and now Beta 2.

Wednesday, March 21, 2012

I <3 Pychart (or how I turned tracking 10 bugs into a programming task)

We are getting closer to release! Beta 2 freeze is tomorrow. Quality in 12.04 is looking very good today. However, we will still see hundreds of bugs get fixed across desktop and server between now and April 26th. In the past, I've found that in the flurry of activity it's easy to lose track of the most important bugs in all that noise, and then some scrambling ensues.

To counteract this, at least for myself, I had a couple of calls, with Jason Warner (Desktop Engineering Manager), Robbie Williamson (Server Engineering Manager), and Steve Langasek (Foundations Engineering Manager). We talked about what bugs we had (that we know about now) that would actually keep us from releasing as scheduled. We have a term called "release blocking bug", but in point of fact, almost none of them would actually keep us from releasing. The kinds of bugs that would truly make us slip an Ubuntu release are ones that cause problems with existing OSs in multi-boot situations, serious bugs in the installer, serious bugs in update manager, bugs that result in a loss of networking, etc... Bugs that can reasonably be fixed in an update do not block the release.

We decided that the best way to keep track of the very few bugs like this is to continue to track them as normal, but to set their importance as critical.

There is another set of bugs that I also ask the team to focus on. This set is more aspirational. I want us to fix all of the upgrade bugs that we find from automated testing, or at least all of the High and Critical importance ones. I would sincerely love to see every upgrade go smoothly for all of the millions of people who will be upgrading to Precise.

So, when am I going to start talking about pychart? Right now, in fact! Keeping tabs on bugs is boring, so it must be automated, and I love automating things with Python. So, I wrote a program that scrapes the data from those 2 pages, stores the info in a sqlite database, and generates a line graph each time I run it.

You can see all the code here if you want, but I doubt you do; it's pretty hacky. But it was fun to bring together the excellent json, HTMLParser, sqlite3, and pychart libraries.
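The scrape-and-store part is straightforward. Here is a minimal sketch of the sqlite bookkeeping; the schema and column names are my own guesses for illustration, not the actual script's:

```python
import sqlite3

# in-memory here for the sketch; the real script would use a file
# on disk so that counts accumulate across daily runs
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS counts "
             "(day TEXT, upgrade_bugs INTEGER, blocker_bugs INTEGER)")

def record(day, upgrade_bugs, blocker_bugs):
    conn.execute("INSERT INTO counts VALUES (?, ?, ?)",
                 (day, upgrade_bugs, blocker_bugs))

record("2012-03-21", 14, 3)
record("2012-03-22", 12, 2)

# one row per day, in the shape a line plot would consume
graph_list = conn.execute(
    "SELECT day, upgrade_bugs, blocker_bugs "
    "FROM counts ORDER BY day").fetchall()
```

Each run appends the day's counts, and the whole table becomes the data series for the graph.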

Here's the pychart money shot:
# assumes "from pychart import *", "from datetime import date",
# and a graph_list of (date_ordinal, upgrade_count, blocker_count) rows
def format_date(ordinal):
    d = date.fromordinal(int(ordinal))
    return "/a60{}" + d.strftime("%b %d, %y")

xaxis = axis.X(label=_("Date"), tic_interval=1, format=format_date)
yaxis = axis.Y(tic_interval=2, label="Open Bugs")
ar = area.T(x_axis=xaxis, y_axis=yaxis, y_range=(0, None))
plot = line_plot.T(label="upgrade", data=graph_list, ycol=1,
                   tick_mark=tick_mark.circle3)  # tick mark reconstructed; original was cut off
plot2 = line_plot.T(label="blockers", data=graph_list, ycol=2,
                    tick_mark=tick_mark.square)
ar.add_plot(plot, plot2)

can = canvas.init("/home/rick/Documents/bugs.png", "png")
ar.draw(can)

Tuesday, February 21, 2012

GObject Introspection Prompts

Dang, I hate how I often type "GIO" instead of "GOI".

Anyway, I'm starting a week of focusing on coding. Unfortunately I have a bunch of meetings that I cannot escape, but otherwise, I cancelled all non-essential meetings, and will be diving into the platform and working with the real application developer experience on Precise. Also, I have a few work items that I should really take care of.

Today, I started with a bite-sized morsel. I updated quickly.prompts to use GObject Introspection. The key value here is that you can now use quickly.prompts with a modern Quickly app.

The branch is waiting to be reviewed and merged here.

Wednesday, February 8, 2012

Girrrr: PyGame + Gtk in a GOI World

Back in August, I wrote a bit about how to embed PyGame into a pygtk app (and why it might be interesting to do that). Well, the world has moved on a bit, so today I updated the code sample to work with GObject Introspection.

It wasn't too hard to do, but did take a bit of digging around. I created a diff between the files and then commented on the diff, so you can see the required changes.

 === modified file ''
--- 2011-08-25 12:14:00 +0000
+++ 2012-02-08 10:22:50 +0000
@@ -1,41 +1,41 @@
import pygame
import os
#you can't import Gtk and GObject in the old way
#so delete these imports
-import gobject
-import gtk
#I haven't made quickly prompts work with introspection yet
#I think it will be easy, but in the meantime, we can't use
#quickly.widgets or quickly.prompts
-from quickly import prompts
#here's how to import GObject and Gtk
#you have to import GdkX11 or you can't get a widget's xid
+from gi.repository import GObject
+from gi.repository import Gtk
+from gi.repository import GdkX11
#"gtk" has to be changed to "Gtk" everywhere
#I used find and replace for this
-class GameWindow(gtk.Window):
+class GameWindow(Gtk.Window):
def __init__(self):
- gtk.Window.__init__(self)
- vbox = gtk.VBox(False, 2)
+ Gtk.Window.__init__(self)
+ vbox = Gtk.VBox(False, 2)
#create the menu
- file_menu = gtk.Menu()
+ file_menu = Gtk.Menu()
- accel_group = gtk.AccelGroup()
+ accel_group = Gtk.AccelGroup()
- dialog_item = gtk.MenuItem()
+ dialog_item = Gtk.MenuItem()
- quit_item = gtk.MenuItem()
+ quit_item = Gtk.MenuItem()
- menu_bar = gtk.MenuBar()
+ menu_bar = Gtk.MenuBar()
vbox.pack_start(menu_bar, False, False, 0)
- file_item = gtk.MenuItem()
+ file_item = Gtk.MenuItem()
@@ -44,10 +44,10 @@
#create the drawing area
- da = gtk.DrawingArea()
+ da = Gtk.DrawingArea()
- vbox.pack_end(da)
#pygtk didn't require all of the arguments for packing
#but Gtk does, so you have to add all the arguments to pack_end here
+ vbox.pack_end(da, False, False, 0)
#set up the pygame objects
@@ -70,7 +70,15 @@
self.y += 5
def show_dialog(self, widget, data=None):
-"A Pygtk Dialog", "See it works easy")
+"A Pygtk Dialog", "See it works easy")
#I just hand crafted a dialog until I can get quickly.prompts ported
+ title = "PyGame embedded in Gtk Example"
#a lot of the constants work differently
#gtk.DIALOG_MODAL -> Gtk.DialogFlags.MODAL
#gtk.RESPONSE_OK -> Gtk.ResponseType.OK
#There's some info here to get started:
#but I found that I had to poke around with ipython a bit to get it right
+ dialog = Gtk.Dialog(title, None, Gtk.DialogFlags.MODAL,(Gtk.STOCK_CANCEL, Gtk.ResponseType.CANCEL, Gtk.STOCK_OK, Gtk.ResponseType.OK))
+ content_area = dialog.get_content_area()
+ label = Gtk.Label("See, it still works")
+ content_area.add(label)
+ response =
+ dialog.destroy()
def quit(self, widget, data=None):
@@ -87,14 +95,14 @@
return True
def _realized(self, widget, data=None):
#since I imported GdkX11, I can get the xid
#but note that the properties are now function calls
- os.putenv('SDL_WINDOWID', str(widget.window.xid))
+ os.putenv('SDL_WINDOWID', str(widget.get_window().get_xid()))
pygame.display.set_mode((300, 300), 0, 0)
self.screen = pygame.display.get_surface()
- gobject.timeout_add(200, self.draw)
+ GObject.timeout_add(200, self.draw)
if __name__ == "__main__":
window = GameWindow()
- window.connect("destroy",gtk.main_quit)
+ window.connect("destroy",Gtk.main_quit)
- gtk.main()
+ Gtk.main()
I pushed the example to launchpad, in case you want to see the whole thing in context.

Thursday, January 12, 2012

Bit of fun with JQuery and CSS

I stole some time to play a bit more with veritas and JQuery today. Instead of the ugly list that I had before, I wanted some interactivity. So I started by adding a little CSS to make a "card" for each bottle.
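The card styling amounts to a few rules like these (the class name and values below are invented for illustration; the post didn't include the actual CSS):

```
/* hypothetical "card" style for each bottle div */
.bottle {
  position: absolute;      /* so left/top set from JavaScript take effect */
  width: 120px;
  padding: 8px;
  border: 1px solid #999;
  border-radius: 4px;
  background: #fdf6e3;
  box-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3);
}
```

Note that `position: absolute` is what lets the JavaScript offset each card with `left` and `top`.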

Then I wrote a bit of javascript to make each div that I pass into the html into a JQuery "draggable", and do a bit of cheap layout.

    else if(signal == "add_bottle")
    {
        div = jQuery(data, {}).draggable();
        div.css('left', lft);
        div.css('top', tp);  // tp is incremented below, so a top offset was presumably set too
        lft += 10;
        tp += 10;
        $( "#bottle_div" ).append(div);
    }
Next I'll add some nicer layout. Then I'll start adding filters and dropdowns so I can sort and do other fun stuff.