Thanks to the hard work of Jeremy Bicha and others, Ubuntu GNOME is now an official Ubuntu flavour. Flavours get infrastructure and support benefits, such as ISO creation, that make them easier to release and support.
Tuesday, March 12, 2013
More information on Mir
With the recent announcement of Mir there's been some concern about what this means for Ubuntu and the wider Linux ecosystem. Christopher Halse Rogers, who is on the Mir team, has written some excellent posts covering some of the major questions: why Mir and not Wayland/Weston, what this means for other desktops on Ubuntu, and what this means for Linux graphics drivers.
Well worth the read.
Tuesday, March 05, 2013
Mir
Today we go public with the Ubuntu graphics stack for the post-X world. Since the beginning Ubuntu has relied on the X server to support the user experience, and while it has generally worked well, it's time for something new. My team is working on a big new component for this: Mir. Mir is a graphics technology that allows us to implement the user experience we want for Ubuntu across all the devices we support.
In many ways, Mir will be completely transparent to the user. Applications that use toolkits (e.g. Qt, GTK+) will not need to be recompiled. Unity will still look like Unity. We will support legacy X applications for the foreseeable future.
This is a big task. A lot of work has already been done and there's a lot more to go. We're aiming to do incremental improvements, and you can find out more about this on the Wiki page and in the blueprints. You can help. From today our project is public; it's GPL licensed and you're welcome to use the source and propose changes.
It’s exciting times, and I hope you enjoy the results of this work!
Thursday, January 10, 2013
Vala support for Protocol Buffers
Recently I was playing around with Protocol Buffers, a data interchange format from Google. In the past I have spent quite a bit of time working with ASN.1 which is a similar format that has been around for many years. Protocol buffers seem to me to be a nice distillation of the useful parts of efficient data interchange and a welcome relief to the enormous size of the ASN.1 specifications.
With Vala being my favourite super-productive language I felt the need to add support to it. Solution: protobuf-vala.
Let's see it in action. Say you have the following protocol in rating.proto:
message Rating {
required string thing = 1;
required uint32 n_stars = 2 [default = 3];
optional string comment = 3;
}
Run it through the protocol buffer compiler with:
$ protoc rating.proto --vala_out=.
This will create a Vala file rating.pb.vala with a class like this:
public class Rating
{
    public string thing;
    public uint32 n_stars;
    public string comment;
    /* ... plus encode () and decode () methods ... */
}
You can use this class to encode a rating, e.g. for storing to a file or sending over a network protocol:
var rating = new Rating ();
rating.thing = "Vala";
rating.comment = "Vala is super awesome!";
rating.n_stars = 5;
var buffer = new Protobuf.EncodeBuffer ();
rating.encode (buffer);
do_something_with_data (buffer.data);
And decode it:
var data = get_data_from_somewhere ();
var buffer = new Protobuf.DecodeBuffer (data);
var rating = new Rating ();
rating.decode (buffer);
stderr.printf ("%s is %u stars\n", rating.thing, rating.n_stars);
That's pretty much it!
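If you're curious what those bytes look like, the encoding used is the standard protocol buffer wire format, which is simple enough to sketch by hand. Here's a rough, purely illustrative Python sketch (not part of protobuf-vala; the field numbers come from rating.proto above, the rest is plain protobuf encoding):

```python
# Hand-rolled sketch of the protocol buffer wire format for the Rating
# message above. Field numbers come from rating.proto; the encoding rules
# are the standard protobuf ones (varints and length-delimited strings).

def encode_varint(n):
    # Varints store 7 bits per byte, least-significant group first,
    # with the high bit set on every byte except the last.
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def encode_field(number, wire_type, payload):
    # Every field starts with a tag: (field_number << 3) | wire_type.
    return encode_varint((number << 3) | wire_type) + payload

def encode_rating(thing, n_stars, comment=None):
    # thing: field 1, wire type 2 (length-delimited string)
    data = encode_field(1, 2, encode_varint(len(thing)) + thing.encode())
    # n_stars: field 2, wire type 0 (varint)
    data += encode_field(2, 0, encode_varint(n_stars))
    # comment: field 3, optional, length-delimited string
    if comment is not None:
        data += encode_field(3, 2, encode_varint(len(comment)) + comment.encode())
    return data

print(encode_rating("Vala", 5).hex())  # 0a0456616c611005
```

Reading the output back: 0a is the tag for field 1, 04 the string length, then the bytes of "Vala", and 10 05 is five stars in field 2.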
If you're using Ubuntu (12.04 LTS, 12.10 or 13.04) then you can install Vala protocol buffer support with:
$ sudo apt-add-repository ppa:protobuf-vala-team/ppa
$ sudo apt-get update
$ sudo apt-get install protobuf-compiler-vala libprotobuf-vala-dev
Have fun!
Thursday, December 27, 2012
A script for supporting multiple Ubuntu releases in a PPA
Something I find time consuming is uploading to a PPA when you want to support multiple Ubuntu releases. For my projects I generally want to support the most recent LTS release, the current stable release and the current development release (precise, quantal and raring when this was written).
I have my program in a branch and release that with make distcheck and lp-project-upload. The packaging is stored in another branch.
For each release I update the packaging with dch -i and add a new entry, e.g.
myproject (0.1.5-0ubuntu1) precise; urgency=low
* New upstream release:
- New exciting stuff
-- Me <me@canonical.com>  Thu, 27 Dec 2012 16:52:22 +1300
I then run release.sh and this generates three source packages and uploads them to the PPA:
NAME=myproject
PPA=ppa:myteam/myproject
RELEASES="raring quantal precise"
VERSION=`head -1 debian/changelog | grep -o '[0-9.]*' | head -1`
ORIG_RELEASE=`head -1 debian/changelog | sed 's/.*) \(.*\);.*/\1/'`
for RELEASE in $RELEASES ;
do
cp debian/changelog debian/changelog.backup
sed -i "s/${ORIG_RELEASE}/${RELEASE}/;s/0ubuntu1/0ubuntu1~${RELEASE}1/" debian/changelog
bzr-buildpackage -S -- -sa
dput ${PPA} ../${NAME}_${VERSION}-0ubuntu1~${RELEASE}1_source.changes
mv debian/changelog.backup debian/changelog
done
Hope this is useful for someone!
Note I don't use source recipes as I want just a single package uploaded for each release.
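The heart of release.sh is the sed expression that rewrites the changelog header for each target release. As a sanity check, the same transformation can be sketched in Python (a hypothetical retarget helper; it assumes a standard one-line changelog header like the example entry above):

```python
import re

def retarget(header, release):
    # Mirror of the script's sed expression: first swap in the new release
    # name, then append the per-release suffix to the Debian revision,
    # e.g. "0.1.5-0ubuntu1" -> "0.1.5-0ubuntu1~raring1".
    orig_release = re.sub(r'.*\) (.*);.*', r'\1', header)
    header = header.replace(orig_release, release, 1)
    return header.replace('0ubuntu1', '0ubuntu1~%s1' % release, 1)

print(retarget("myproject (0.1.5-0ubuntu1) precise; urgency=low", "raring"))
# myproject (0.1.5-0ubuntu1~raring1) raring; urgency=low
```

The ~release1 suffix keeps the per-release versions distinct while still sorting below the plain 0ubuntu1 revision, since ~ sorts before anything in Debian version comparison.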
Wednesday, November 21, 2012
Testing Kerberos in Ubuntu
In fixing a LightDM bug recently I needed to set up Kerberos authentication for testing. Now, Kerberos comes with quite a reputation for complexity so this was not a task I was looking forward to. And googling around to get some simple Ubuntu instructions only ended up confirming my expectations. But in the end, I was able to get it to work [1] and here is what I did. You should probably not rely on this information for an actual Kerberos implementation.
I start with two machines running Ubuntu, one as the Kerberos server [2] and one as a client. The client is already installed with a user account called test.
Server configuration
Edit /etc/krb5.conf to set the default realm [3]:
[libdefaults]
default_realm = TEST
Install the Kerberos server:
$ sudo apt-get install krb5-kdc krb5-admin-server
Create the realm. You will be prompted for a master password for the realm:
$ sudo krb5_newrealm
Add a new user (called a principal in Kerberos language) into the realm with the same username as on the client. You will be prompted for a password for this user [4]:
$ sudo kadmin.local
kadmin.local: add_principal test
And now the server should be running. You can check things are working by watching the log:
$ tail -f /var/log/auth.log
Client configuration
The client is a lot easier, as the packages do most of the work for you:
$ sudo apt-get install krb5-user
You will be prompted for the following information:
- Set "Default Kerberos version 5 realm" to TEST
- Set "Kerberos server for your realm" to address / hostname of your server
- Set "Administrative server for your Kerberos realm" to address / hostname of your server
You can test authentication by requesting a ticket [5] and then discarding it:
$ kinit
$ kdestroy
If that worked then you're ready to go. Have a look at auth.log on the server if it didn't work (the error messages are a bit cryptic though).
The next step is to set up PAM [6] to allow authentication with Kerberos. There's no configuration required, just install it:
$ sudo apt-get install libpam-krb5
Now you can log into your client machine (e.g. from LightDM/Unity Greeter) using the Kerberos password you set up on the server. Remember, if something went wrong you can still use the local password to get in [7].
The reason I set all this up was to test Kerberos accounts which need password changes. You can control this feature from the server using the following:
$ sudo kadmin.local
kadmin.local: modify_principal +needchange test
[1] Tested on Ubuntu 13.04 (server) and 12.04 (client). I don't know which other combinations will work.
[2] Called a Key Distribution Centre in Kerberos jargon.
[3] Kerberos calls different authentication domains realms. I've used the realm TEST though in proper usage this would be a domain name e.g. EXAMPLE.COM to avoid name collision.
[4] You will already have a password set for this user on the client machine. Pick a different password, as this allows you to log in with either the Kerberos or the local password - both will work.
[5] A ticket is the name for an authentication token provided by the server. In a real implementation this ticket will allow you to access services without re-entering your password.
[6] PAM is the library that does authentication when logging into Ubuntu.
[7] The PAM configuration that the packages set up first tries your password with the Kerberos server, then the local passwords (/etc/shadow) if that fails.
Thursday, February 09, 2012
So You Want to Write a LightDM Greeter…
Matt Fischer wrote a great post about writing a greeter for LightDM. Runs through an example of a Python greeter and how it works.
Thursday, December 08, 2011
Gnome Games Modernisation
The GNOME Games project maintains fifteen small "five-minute" games for the GNOME desktop.
Unfortunately, over time the games have struggled to keep up with the latest GNOME technology, simply due to the time required. And the further behind we've got, the harder it is for new developers to get involved, as the code is hard to work with.
So the time has come for a great modernising. And here's where you fit in :)
We've picked the games we think are the best and want to focus on bringing them up to modern standards: Chess, Five or More, Mines, Iagno, Mahjongg, Sudoku and Swell Foop. These games have all been ported to Vala, or are in the process of being ported. Vala is a modern programming language that will be familiar to anyone who has used Java or C#.
We have a matrix of things to do, with the goal being to turn everything green.
So, if you're interested in helping out follow any of the bug links and start fixing the bugs! All the tasks should be able to be completed independently and shouldn't be too complex to achieve. Anyone is welcome to attempt these and there are non-coding tasks (documentation and design).
Saturday, September 10, 2011
GNOME OS
There was a very good interview with Jon McCann recently about GNOME 3 and GNOME OS. Reading public comments on this interview showed a lot of negativity which I think missed the good points.
GNOME OS is unfortunately very loosely defined, but from what I can gather it's essentially about controlling the entire stack that GNOME runs on, to make a better experience and make it easier to work on the project.
What I like about this strategy:
- It focuses on the users that GNOME is targeting. We've had a strong direction since the GNOME 2 days, and focusing on the features these users need, through design, is the right way of getting there.
- It's dropping old desktop metaphors and moving to new ones. There are other desktops like XFCE which will continue the Windows 95 desktop metaphor and be successful with it; it's right for GNOME to move on and be more cutting edge.
What I don't like about this strategy:
- It downplays the value of GNOME as a "box of bits". The drum I'm banging at the moment is about sharing infrastructure. This is something GNOME has been very successful with in the past, and discouraging it cuts off a lot of places where GNOME can get investment from other projects.
- It puts very strong requirements on distributors which they don't want to / can't meet. GNOME is not like Apple, it can't control the entire stack from hardware to sales. It needs to work with distributors or have a distribution strategy. Building a perfect desktop is not enough.
Desktop common ground
There's a common argument that you hear about open source desktops which goes something like "we have less than 1% market share; the other desktops are laughing at us; we should pool together and make a real contender".
And you know what, they're right! We don't have a significant market share, and we're not at the point where we have a truly amazing desktop experience (but we're getting closer). Anything we can do to get there faster must be a good thing.
And you know what, they're wrong. We don't have a finite developer resource. Open-source is amazing like that - when a project starts that people care about suddenly the community grows. Trying to mash everyone together into one project wouldn't work and would probably make us even slower. Putting all our eggs in one basket is a big risk.
How can we share resources to grow that market share without pushing everyone together into one big compromise? We need to share infrastructure. What we need is a POSIX for the 21st century. We've been slowly building this with things like the Linux kernel, D-Bus, X and GStreamer. We can do more.
There's been a rise in design thinking in open source, which has been really good for everyone. But I think the pendulum has swung too far. User experience is not the only factor in deciding what to do. Infrastructure is expensive. Every bad API slows down progress on the layers above it. And every desktop developer that dives into infrastructure is not working on those layers either.
Sharing is hard. But the cost of not sharing is huge. Let's make sure the infrastructure we're building for tomorrow works cross-desktop, so we can share those costs.
Friday, September 02, 2011
Desktop Summit 2011
Last month I attended the Desktop Summit 2011 in Berlin. Unfortunately I was only there for the core days, which is a shame because Berlin is an awesome city and the summit is awesome too.
The quality of the talks this year was great, and I only had one or two slots the whole time where there was nothing I wanted to go to. This summit felt more integrated than the last one and I hope this continues into the future.
Some highlights:
- There was a good response to LightDM. I felt my talk had a lack of GNOME people present, but I think the GTK4 talk may have absorbed them.
- Lennart did a well researched talk on revamping the login system which sounds very good and left me wanting to know more.
- From what I heard the future of GTK+4 and Clutter looks very promising, but I haven't been able to see the talk as I was doing mine. Can't wait for the videos to come out so I can find out more.
- Vincent Untz did a really thoughtful talk on his experiences as a GNOME release manager.
- GNOME Shell seems to be progressing very well and there were a lot of talks on it.
- Plasma/KDE also seems to be doing a lot of innovation.
- There was basically no mention of Unity.
- There was the usual amount of Canonical bashing and it's not helping anyone. The GNOME State of the Union had too many cheap jabs and the half hearted laughter shows it's just not funny anymore.
- There was a lack of Canonical people present, and it was commented on numerous times. I'm personally not surprised, as every year more of my colleagues just don't want to be there. Andrea Cimitan, who is a great guy, summed it up when he said on Google+ "when I say around here I'm working for "Canonical", people stop smiling :)".
- There was little mention of GNOME OS. Sometimes we need to be more than just hackers talking about technology and really talk more about planning and strategy.
Some things I'd like to see next time:
- Increased visibility of other desktops - it still feels very GNOME and KDE centric. I think we can learn a lot from projects like XFCE, LXDE, Elementary, Unity etc.
- Increased collaboration on infrastructure - we need to get freedesktop.org into better shape so we can pool our resources on the boring stuff and focus more on the user-facing components which make us successful.
Thursday, September 01, 2011
Broken PDFs in Simple Scan
Since version 2.32 Simple Scan has had a bug where it generates PDF files with invalid cross-reference tables. The good news is this bug is now fixed, and will work correctly in simple-scan 3.2; thanks to Rafał Mużyło who diagnosed this. You may not have noticed this bug as a number of PDF readers handle these types of failures and rebuild the table (e.g. Evince). It was noticed that some versions of Adobe Reader do not handle these failures.
I've added a command line option that can fix existing PDF files that you have generated with Simple Scan. To use it, run the following:
simple-scan --fix-pdf ~/Documents/*.pdf
It should be safe to run this on all PDF documents but PLEASE BACKUP FIRST. It will copy each existing document to DocumentName.pdf~ before replacing it with the fixed version, so you have those in case anything goes wrong.
If you can't wait for the next simple-scan, you can also run this Python program (i.e. python fixpdf.py broken.pdf > fixed.pdf):
import sys
import re

lines = file (sys.argv[1]).readlines ()

xref_offset = 0
for (n, line) in enumerate (lines):
    # Fix PDF header and binary comment (drop the doubled '%')
    if (n == 0 or n == 1) and line.startswith ('%%'):
        xref_offset -= 1
        line = line[1:]
    # Fix xref entry format
    match = re.match ('(\d\d\d\d\d\d\d\d\d\d) 0000 n\n', line)
    if match != None:
        offset = int (match.groups ()[0])
        line = '%010d 00000 n \n' % (offset + xref_offset)
    # Fix xref offset
    if n == len(lines) - 2:
        line = '%d\n' % (int (line) + xref_offset)
    # Fix EOF marker
    if n == len(lines) - 1 and line.startswith ('%%%%'):
        line = line[2:]
    print line,
Wednesday, August 03, 2011
LightDM at the Desktop Summit
I'm going to the Desktop Summit and will be doing a talk on LightDM; please come along if you're interested in this project. It's at the same time as the GTK4 talk unfortunately, but I promise I'll watch that one on video and turn up to mine...
Thursday, May 19, 2011
Razing the Bazaar
Currently in GNOME there is some tension, as we move into the post-3.0 world, about the scope and direction of the project. I won't go into the details of this, but essentially a number of core developers are pushing for a future in which:
- The scope is widened to include more as part of core GNOME. This is to allow more control and integration to produce a better user experience.
- The project focus is being narrowed to have tighter requirements. This is to reduce support overhead and complexity.
I have some concerns about this direction:
- If we build the perfect OS in GNOME it will not be enough. History is littered with better products that failed to succeed. Making an OS successful is as much about the ability to deliver that OS to end users as it is about the OS design and quality.
- If we base all our decision making on "what user visible change does this have?" then we risk losing innovation in our platform. End-users are only one type of user in an OS and not all changes are relevant to them. We have to think more in terms of "will this have a bad effect on end-users?" and look at other aspects.
- If we narrow our focus too much we risk losing some of our current community. The community is an enormous asset of GNOME, and not something we should take for granted. This is not a company; it is driven by motivated individuals (some of whom are then employed by companies). There are a great number of communities out there, and GNOME needs to be competitive.
- If we try to control everything then we increase the maintenance burden on one project. There is no guaranteed funding to get us to GNOME 4. We should always look (within reason) for opportunities to collaborate with other communities.
UPDATE: Changed description of project focus, as it is confusing the point of this post.
Wednesday, March 09, 2011
And now for some good news
With all the doom and gloom blog posts running around at the moment you may be forgetting all the awesome progress that is being made. So I just wanted to shout out some things that are happening that I love:
GTK3
GTK+ has been cleaned up and it shows! GTK is a great toolkit but it had been showing its age. The tidying up (particularly removing the GDK stuff) has significantly reduced the learning curve. And more improvements are planned for GTK4!
GNOME Shell/Unity
The core user interface is being pulled from the 1990s to the future! There are real risks and challenges here but it's progress in making GNOME the front-running interface it deserves to be.
GObject Introspection
No more out-of-date language bindings! With introspection information GNOME developers have huge flexibility in picking languages and all languages are first class citizens.
Vala
A modern language for a modern desktop! Languages like Java and C# offered a lot of promise, but never seemed to break into GNOME. A modern language makes us more productive, attracts new experienced developers and gives us an opportunity to escape from the albatross around our neck (C).
Monday, December 06, 2010
Brainstorm Idea #25877: GNOME System Monitor lacks in-depth information
One of the popular Ubuntu Brainstorm ideas is to improve GNOME System Monitor. This has been reported to the upstream project, and the project developers agree with the concept.
There is a proposed design (no attribution, as it's not clear who made it).
How can you get involved?
If you have some coding skills, then consider making a patch to fix this! This is a well-defined feature request, and should be relatively easy to get to work. Here's how:
1. Comment on the bug that you are interested in working on this. Ideally the GNOME System Monitor developers will be able to help you, but I am also watching the bug and willing to help out.
2. Get the dependencies required to build GNOME System Monitor. On Ubuntu this is as easy as: sudo apt-get build-dep gnome-system-monitor.
3. Check out the upstream source: git clone git://git.gnome.org/gnome-system-monitor.
4. Build and test it: ./autogen.sh --prefix=`pwd`/install ; make ; make install ; ./install/bin/gnome-system-monitor.
5. Make some changes and repeat from step 4 until everything works.
6. When you are done, do git commit -a; git format-patch origin and attach the patch to the bug report.
Thursday, November 11, 2010
Using the latest SANE drivers in Ubuntu 10.04, 10.10
If you are running Ubuntu 10.04 LTS or 10.10 and your scanner is not supported, then you can try the latest releases of the SANE drivers from my sane-backends PPA. The following steps from the terminal will enable it.
$ sudo add-apt-repository ppa:robert-ancell/sane-backends
$ sudo apt-get update
$ sudo apt-get upgrade
Please send feedback to the SANE project if you continue to have problems!
GNOME3 PPA
In the Ubuntu Desktop team we're currently packaging GNOME 3 components for Ubuntu as per the blueprint decided at the last Ubuntu Developer Summit.
If you're already brave enough to be running Natty, then you can additionally try some new GNOME 3 applications by adding the GNOME3 builds PPA into your sources. Expect the usual - the packages may not work perfectly, and it's non-trivial to downgrade, so be warned!
We're going to evaluate these packages in the PPA and decide how many are appropriate to include in Natty.
LightDM status update
It must be time to update on how progress is going with LightDM.
- PCMan from LXDE wrote an awesome new greeter that uses GtkBuilder for layout. This allows you to easily theme up new greeters, the default one being the old Industrial theme:
- There's been some interest in writing a QT based greeter, and I hope we'll be able to show that working soon.
- I've started to write some documentation.
- I decided not to propose LightDM for inclusion in GNOME 3 (I don't feel it will be sufficiently ready in time). This raised the question of whether we should switch display managers in the future, and got some good feedback.
- We had a UDS session on LightDM which also gave some good feedback. The outcome is that I plan to make LightDM easily installable in Ubuntu 11.04 as an alternative display manager.
Friday, October 29, 2010
How I Learned to Stop Worrying and Love the Vala
When I first heard of Vala I was not impressed. A domain-specific language? That seemed like a dead end; how would we ever attract developers to the GNOME platform?
After a while of thinking about it, I realised that GNOME was already in this position. GObject+C is already a domain-specific language. New developers already have to learn GObject, and finding developers who are both proficient in C and willing to use it can be a struggle.
Recently I have been working more and more with Vala, porting applications from PyGTK/C to it. Before Vala, PyGTK had a lot of advantages over C; now the main trade-off is easy debugging and fast development (Python) versus fast performance and type checking for easier maintenance (Vala).
So, my recommendation is if you have a desktop application that uses GObject APIs* and you are happy with debugging tools like gdb and valgrind then consider using Vala!
* I was working on porting LightDM to Vala but old system APIs were causing difficulty. If you have this case, consider wrapping them in C+GObject first and then interfacing to that.