On 23 April 2015, Mihai Șucan passed away due to metastatic
cancer caused by RDEB.
My name is Mihai and I work on the Firefox developer tools.
When it comes to web development, I like both server-side and
client-side work. I am mainly interested in web browsers, web
standards and related technologies.
6 June 2012, 10:35
Hello Mozillians!
For the Aurora update that's happening this week we have some major
changes under the hood for the Web Console.
Async Web Console
Work on making the Web Console UI async - decoupled from all the
error and network listeners - started in July last year (bug
673148). Even though it took almost a year to get this bug fixed,
I worked on these patches for about 3 months (loosely summing up
the total days of work spent on this specific bug).
Back in those months we strongly pushed for various developer
tools to land and get enabled by default in Firefox. I've been
working a lot on the source editor which was needed for the Style
Editor, the JS debugger and Scratchpad. Work on the Web Console was
on and off.
There was modest to good progress for the Web Console async
work until around September - October when Mozilla's electrolysis
project was re-prioritized. The initial work on the patch started
out with the goal of making the Web Console ready for e10s. When
priorities changed, I went back to source editor work which was
higher priority at that point.
In January and February a truly courageous contributor,
Sonny Piers, took the huge patch and rebased it. His efforts were
commendable given the size and complexity of the work involved.
Thank you Sonny!
In March I resumed work and focused strongly on completing
the async patches. Last week the final patch landed in the nightly
builds of Firefox.
What changed? Most of the Web Console used to be implemented in a
single file, HUDService.jsm. It had everything - from UI code to
all the error and network listeners, and the code for the
window.console API. We have now broken that code into separate
scripts, with the goal of leaving HUDService.jsm as the script
that implements only the UI. The new HUDService-content.js script
implements all the listeners - all the "backend stuff". The UI
code must no longer directly access the content window and objects
from the content document.
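To give a rough idea of the shape of this split, here is a minimal
sketch. The message name, the aMessage.json payload access and the
appendOutputNode() helper are illustrative assumptions, not the
actual HUDService code:

// HUDService-content.js side (runs alongside the page): observe an
// event and forward it asynchronously; the page never waits for the UI.
function reportConsoleCall(aMethod, aArguments) {
  sendAsyncMessage("WebConsole:ConsoleAPI", {
    method: aMethod,          // "log", "warn", "error", ...
    arguments: aArguments,
    timestamp: Date.now(),
  });
}

// HUDService.jsm side (the UI): render messages whenever they arrive.
messageManager.addMessageListener("WebConsole:ConsoleAPI", function (aMessage) {
  appendOutputNode(aMessage.json); // hypothetical UI helper
});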
Why? This work allows us to move to the remote
debug protocol and to have the Web Console UI connect to your
Firefox Mobile or B2G device, where all the error and network
listeners are instantiated. It paves the way to a remotable Web
Console.
The added benefit is that the async-ness brought modest
performance gains to content scripts (pages) that use the
window.console API - a call to any console method no longer has to
wait for the Web Console UI to update.
In the future, other Firefox components and extensions can build
different UIs on top of the data collected by the
HUDService-content.js script.
Lessons learned:
- Focus, focus and focus! On big projects you must not try to
do everything else as well.
- Do not underestimate the time it takes to polish working code
and make it ready for review. I had working code in
September-October, but getting it "done" took quite a bit longer.
- Make sure your manager is aware there's a ton of work to do on
your project. The temptation to be nice and helpful and do a lot
of other work in between is high. ;)
- Aggressively split your work into smaller chunks.
- Be lazy - avoid doing work you don't need to do for the goal at
hand.
Thanks go to Rob Campbell, Dave Camp, Felipe Gomez, Ms2ger, Joe
Walker, Sonny Piers and everyone else who contributed to getting
these patches to be ready to land.
Improved performance
Building on top of the async Web Console work we've also made some
really nice output performance improvements (bug
722685). In bug
746869 Boris Zbarsky analyzed the performance issues in our
code and he made a number of valuable suggestions on how we can
make it faster. Thank you Boris!
Our first attempt to make the Web Console output faster has landed
in Firefox. Let's go straight for the numbers:
- Opera 12 (post-beta, latest snapshot, with "cutting-edge"
Dragonfly):
- Chromium 18 (beta):
  - Closed console:
    - Simple string: 21 ms
    - Interpolation: 11 ms
  - Open console:
    - Simple string: 66 ms
    - Interpolation: 68 ms
  Performance in content pages is very good, but display
  performance is poor: the first run is fast, while subsequent runs
  take far longer, and the Web Inspector UI freezes for many
  seconds during the second and third runs. Content process
  separation helps a lot - even while the Web Inspector's display
  is frozen, web pages continue to run smoothly.
- Firefox 13 (without the async patches):
- Firefox 15 nightly (with the async patches landed):
- Firefox 15 Aurora (with the performance patch landed):
  - Closed console:
    - Simple string: 50 ms
    - Interpolated string: 48 ms
  - Open console:
    - Simple string: 51 ms
    - Interpolated string: 48 ms
For comparison, do note that 1000 dump() calls take around
10-20 ms in Firefox. (dump() is a dumb method we use to output
messages to STDOUT.)
Having the Web Console open or closed no longer directly impacts
console API calls. Now the UI no longer freezes and results show up
quickly.
I tried with 5000 calls and we now do better than Opera's Dragonfly
and Chrome's Web Inspector - in terms of UI updates. Still, console
API calls finish faster, for some reason, in those two browsers.
Please do note that I used the simple test attached to bug
722685. These numbers are not meant to be "scientific" or anything
like that - they depend on my machine setup.
We will continue working on output performance (bug 761257). At
this point we still need to avoid doing some unneeded work when a
lot of messages end up in the display queue. We also need to
better balance how often, and how many, messages we display during
"heavy fire" - when content scripts invoke the console API methods
many, many times over an extended period.
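As a sketch of the kind of balancing involved - this is
illustrative only, not the code from bug 761257, and the constants
and the appendOutputNode() helper are made up - one can queue
incoming messages and flush only a bounded batch per timer tick:

var queue = [];
var flushTimer = null;
var MAX_PER_FLUSH = 200;  // made-up cap per UI update
var FLUSH_INTERVAL = 50;  // milliseconds, made-up pace

function queueMessage(aMessage) {
  queue.push(aMessage);
  if (!flushTimer) {
    flushTimer = setTimeout(flushQueue, FLUSH_INTERVAL);
  }
}

function flushQueue() {
  flushTimer = null;
  // Display at most MAX_PER_FLUSH messages, then yield back to the UI.
  queue.splice(0, MAX_PER_FLUSH).forEach(appendOutputNode);
  if (queue.length) {
    flushTimer = setTimeout(flushQueue, FLUSH_INTERVAL);
  }
}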
What's next?
We have plans to move the Web Console UI into its own <iframe>,
change the UI to match the theme of the other developer tools, add
an option to move the UI into a real window, make a global console
that could replace the Error Console and, obviously, switch to the
remote debug protocol so you can use the Web Console with remote
Firefox instances. All this and many other improvements, of course!
You may wonder "when?" - the answer is that all these improvements
will come gradually, as we get to implement them.
File bugs, find regressions and let us know what you like and
dislike! Thank you!
Published in:
aurora,
devtools,
firefox,
mozilla,
performance,
web console.
2 July 2009, 19:52
Hello everyone!
Since my last blog post I have completed the user interface
polishing for PaintWeb: the Color Mixer
and the Color Picker are both working fine now.
Today I have completed work on packaging. I also generated the
complete
API reference documentation from the source code.
You can go and play with the PaintWeb
demo at the usual location.
For packaging I use a Makefile, YUICompressor,
jsdoc-toolkit, PHP and some bash scripts. First of all, I merge
all the JavaScript files into a single file. I also merge the
XHTML interface layout into the JavaScript - for this I use a
small PHP script which encodes the markup as a string using
json_encode(). Once I have the hefty script, I use the
YUICompressor tool to make it a lot smaller.
For the PaintWeb interface stylesheet I use the YUICompressor in
combination with a simple PHP script I wrote. The PHP script
inlines the PNG images using data URIs. This helps a lot in
reducing the number of files being downloaded.
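The inliner itself is a PHP script; here is a rough JavaScript
equivalent of the idea (the file names and the Node.js runtime are
assumptions for illustration, not my actual build script):

var fs = require('fs');

var css = fs.readFileSync('style.css', 'utf8');

// Replace each url(foo.png) reference with a base64 data URI.
css = css.replace(/url\((['"]?)([^'")]+\.png)\1\)/g, function (match, quote, file) {
  var base64 = fs.readFileSync(file).toString('base64');
  return 'url(data:image/png;base64,' + base64 + ')';
});

fs.writeFileSync('style.packaged.css', css);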
Here are the numbers, for those curious about the packaging
results. Before packaging:
- 18 JavaScript files, 426.6 KB;
- Three JSON files, 33.9 KB;
- One XHTML file, 14.9 KB;
- One CSS file, 21.8 KB;
- 47 images (PNGs), 206.5 KB;
- A total of 70 files, 703.7 KB.
That's quite a lot. Here's what the current level of packaging
gives us:
- Two JavaScript files, 130.7 KB - one of them, json2.js, is only
3 KB and is not always loaded;
- Three JSON files, 33.9 KB. The JSON files are left untouched,
and the configuration example stays the same - with all the
comments in it. It's up to the integrator to choose what he/she
does with the file (at the moment).
- One CSS file, 297.1 KB - with all the images inlined;
- A total of only six files, worth 461.7 KB.
That's better, but there's room for more. You should always enable
gzip compression on your Apache
server. Here's what a properly configured server can give you:
- Two JavaScript files, 35 KB;
- Three JSON files, 8 KB;
- One CSS file, 99 KB;
- A total of six files, and only 142 KB.
That's much better now. To properly configure your server, make
sure you enable gzip
compression in your .htaccess file:
<IfModule mod_deflate.c>
  <FilesMatch "\.(js|css|json|html)$">
    SetOutputFilter DEFLATE
  </FilesMatch>
</IfModule>
If you are curious how fast PaintWeb loads, I added a timer in the
demo script - you can take a look in a JavaScript console in your
Web browser. On my local system it takes less than a second,
depending on the browser I use. Go ahead and try
PaintWeb yourself. Also make sure you check out the API
reference.
In the coming days I will be publishing guides on PaintWeb
development, extensibility and general code overview. This means
Moodle integration is ready to
begin!
Published in:
apache,
api,
documentation,
gsoc2009,
makefile,
moodle,
paintweb,
performance,
yui.
28 May 2009, 11:55
Hello everyone!
This week I have completed my work on performance testing and
improvements for PaintWeb on the OLPC XO laptop.
During testing it became obvious that something other than the
actual Canvas painting was very slow on the XO. The main
performance culprit is that the default Gecko-based browser is
configured to render pages at 134 DPI instead of the usual 96 DPI.
Web browsers generally render pages at 96 DPI; if the XO web
browser did the same, the text and images would be far too small -
the XO display is an odd 200 DPI screen perceived as 134 DPI.
PaintWeb's drawing performance was hugely affected by the bilinear
scaling of the Canvas elements being done by the browser on the XO.
When I configured the browser to render the page using 96 DPI, the
web application became a lot more responsive.
Martin Langhoff, my mentor, got in contact with Robert O'Callahan
from Mozilla. Robert provided us with lots of help in finding a
solution for the performance issue.
We did think about having a CSS property to change the DPI only
for the Canvas elements, or a different CSS property to disable
scaling, or some proprietary API for changing the DPI on a single
page. None of these are good ideas, because they would allow web
developers to start coding for specific DPIs - which is not
desirable.
Gecko scales pages using integer scaling factors - 1, 2, 3 and so
on - it doesn't use floating-point numbers. In a normal Gecko
build the scaling factor at 134 DPI is 1, because 134 / 96 rounds
down to 1, so you get no scaling at all. You only get a scaling
factor of 2 or higher once you go above 192 DPI.
Gecko is patched on the XO in a way that forces the browser to
scale pages using floating-point scaling factors as well.
Therefore, at 134 DPI pages are scaled, and they look really
good on the XO screen.
The final solution, which I implemented in PaintWeb, is to simply
scale down the Canvas elements in my document. If I scale the
elements down accurately, Gecko is sufficiently optimized to
cancel out the scaling entirely, and there is no noticeable
performance impact. This works really well.
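A minimal sketch of the idea (the element id is hypothetical): the
page is scaled up by pageDpi / 96, so giving the canvas a CSS size
smaller by the inverse factor cancels the scaling, and canvas
pixels map 1:1 to screen pixels.

var pageDpi = 134;
var factor = 96 / pageDpi;

var canvas = document.getElementById('paintweb-canvas'); // hypothetical id
// The backing store (canvas.width/height) stays at full resolution;
// only the CSS size shrinks.
canvas.style.width  = Math.round(canvas.width  * factor) + 'px';
canvas.style.height = Math.round(canvas.height * factor) + 'px';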
In Gecko 1.9.1 (Firefox 3.5) I can detect the DPI used for
rendering the page with CSS 3 Media Queries, and I use this in
PaintWeb. However, the XO only has Gecko 1.9.0 for now, so I
cannot determine the DPI that way. I am forced to do user agent
sniffing to check whether the browser runs on the OLPC XO. If it
does, I scale down the Canvas elements using a different way of
calculating the scale-down factor - because Gecko is patched - and
I always assume the page is rendered at 134 DPI. Fun, huh? ;)
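For the Gecko 1.9.1 case, the detection can work roughly like
this - a hedged sketch with an illustrative selector and
threshold, not the actual PaintWeb code:

// In the stylesheet:
//   #dpi-probe { width: 1px; }
//   @media screen and (min-resolution: 134dpi) {
//     #dpi-probe { width: 2px; }
//   }
var probe = document.createElement('div');
probe.id = 'dpi-probe';
document.body.appendChild(probe);
var highDpi = probe.offsetWidth > 1; // the media query flipped the width
document.body.removeChild(probe);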
On Opera, on the XO, I did all my testing at the 100% zoom level.
It ran much better than Gecko, for obvious reasons (no scaling,
yay). Once I fixed the Gecko scaling issue, Opera came second. For
some reason Canvas drawing is much faster in Gecko than in Opera
on the OLPC XO.
Opera cannot render pages at DPI values other than 96. People use
zoom instead, so, for consistency, I use an old trick to measure
the zoom level (thanks Arve). Based on this I scale down the
Canvas elements. At some zoom levels, like 200%, the scaling is
cancelled out and PaintWeb works better. Unfortunately, Opera does
not allow non-integer pixel values, so the scaling-down is
generally not effective...
Another important performance improvement in PaintWeb is the use
of timer-based canvas drawing. This means that mouse move events
are either dropped or coalesced into one. For example, redrawing a
polygon with a lot of points on every mouse move is very slow.
Instead, the tools in PaintWeb use timers to update the canvas
every few milliseconds. This approach makes PaintWeb feel much
faster.
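Roughly, the coalescing works like this (an illustrative sketch,
not PaintWeb's actual tool code; redrawShape() and the interval
value are made up):

var canvas = document.getElementById('paintweb-canvas'); // hypothetical id
var pendingEvent = null;
var UPDATE_INTERVAL = 40; // milliseconds, made-up pace

canvas.addEventListener('mousemove', function (ev) {
  pendingEvent = ev; // coalesce: only the most recent event survives
}, false);

setInterval(function () {
  if (pendingEvent) {
    redrawShape(pendingEvent); // hypothetical repaint of the current shape
    pendingEvent = null;
  }
}, UPDATE_INTERVAL);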
Lastly, I now avoid lookups in the global scope, for things like
Math.round. The importance of this change is reduced by the fact
that the JavaScript being run is not very intensive - not much
code is executed for each mouse move event. Such changes become
more important the more code you run; this will matter for the
color space visualization I have.
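The change itself is simple; a sketch of the pattern (not the
exact PaintWeb code):

function makeMoveHandler() {
  var round = Math.round; // cached once, so each call avoids walking
                          // the scope chain up to the global object
  return function (ev) {
    var x = round(ev.clientX);
    var y = round(ev.clientY);
    // ... drawing code using x and y ...
  };
}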
The loading performance will improve greatly once I build a
packager for PaintWeb. Additionally, I will keep checking the
overall performance of the web application on the OLPC XO.
Go ahead and try PaintWeb from SVN
trunk. Lots of thanks to Robert for his great help and to Martin
for his assistance and for finding the Gecko patches.
Currently I am working on the new user interface, stay tuned!
Update May 31, 2009: I just published a page on the
OLPC wiki about HTML Canvas performance on the OLPC XO laptops.
The page includes code snippets explaining how to work around the
scaling issue.
Published in:
canvas,
css,
dpi,
gecko,
gsoc2009,
moodle,
olpc,
opera,
paintweb,
performance.
14 May 2009, 11:09
Hello everyone!
I have been working on the PaintWeb code refactoring and I am now
nearing completion. The initial PaintWeb 0.5 alpha code was more
of a demo - it was all in a single big script. I have now added
jsdoc comments almost everywhere and split the code into multiple
files - per tool, per extension, per language, and more. I have
also made important changes to the API: any external code can now
easily add and remove tools, extensions and keyboard shortcuts.
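To sketch the extensibility idea - the method names and signatures
below are purely illustrative, not PaintWeb's actual API:

function MyTool(paintweb) {
  // Event handlers invoked while this tool is active.
  this.mousedown = function (ev) { /* start drawing */ };
  this.mousemove = function (ev) { /* update the preview */ };
  this.mouseup   = function (ev) { /* commit to the canvas */ };
}

var pw = new PaintWeb();       // assumed entry point
pw.toolAdd('mytool', MyTool);  // register a new tool
pw.shortcutAdd('M', 'mytool'); // bind a keyboard shortcut to it
pw.toolRemove('mytool');       // and remove it again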
For more developer-related information please read the latest forum
thread I posted on the Moodle forums.
For teachers and potential users of PaintWeb inside Moodle, I have
prepared a list of questions on how
you would use the paint tool in Moodle.
Martin, my mentor, suggested early in my GSOC application process
that I also apply for the OLPC Contributors program. So I did, and
my project was accepted.
Even though the OLPC XO has a slow CPU by today's standards - it's
only 400 MHz - the system works quite nicely. It has 256 MB of RAM
and 1 GB of disk capacity. The Sugar interface and the activities
provided are amazing. People who hear about these laptops often
fail to appreciate the numerous doors they open - doors to
knowledge - for all the children who receive them. They help a lot
in learning about computing, maths, music, and more.
The Sugar interface is quite well thought out. I like the concept
of having the neighbourhood, group, home and activity views.
The default browser is a Python application embedding Gecko - on
par with Firefox 3.0. The performance of the browser is lacking:
Opera 10 alphas start much faster and feel snappier. The paint
tool feels sluggish as well.
The Gnash plugin is more of a problem than a solution. I installed
Flash Player 10, which is sluggish, but at least it works. With
MPlayer, the system can play YouTube high-quality videos and even
uncompressed DVD videos over the wireless connection; Flash Player
itself cannot play YouTube videos.
Battery life is good - I can use the laptop for about three hours
without any problems.
Since last week I have been working on the performance of the
PaintWeb application on the OLPC XO-1 laptop. After several tests,
I have managed to improve things enough that the paint tool is now
usable in Opera 10 on the XO. Unfortunately, in Browse.xo it is
not, at least not by default.
The main performance culprit affecting PaintWeb on the XO is the
use of layout.css.dpi. Gecko allows users to change the DPI used
for rendering Web pages, in order to make fonts and images smaller
or bigger. On the XO the browser is set to use a DPI of 134
instead of 96, which makes fonts and images render bigger - at
96 DPI they would all be way too small. PaintWeb and all other
pages feel much slower because Gecko performs bilinear image
resampling.
When I set layout.css.dpi to 96, drawing in PaintWeb becomes
real-time. I was amazed to see it work so well - it's just like on
my desktop computer. And... it's even faster than in Opera 10. ;)
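For those who want to try the same thing, the preference can be
set from about:config or from the profile's user.js - a one-line
sketch:

// Force Gecko to lay pages out at 96 DPI instead of the XO's 134.
user_pref("layout.css.dpi", 96);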
If you want, check out the performance tests yourself. Spoiler:
WebKit is the fastest and Gecko is the slowest when running
synthetic tests. Obviously, more performance tests will come -
these are currently limited to the pencil tool and to the main
ev_canvas() event handler in PaintWeb.
Next on my list of things to do is a new GUI and a packager for
the entire codebase. Loading PaintWeb is now slower due to the
amount of code comments and the increased number of files. The
packager will compress/minify all the files into a single one.
That's all for now. Any feedback is welcome!
Published in:
canvas,
css,
dpi,
gecko,
gsoc2009,
moodle,
olpc,
opera,
paintweb,
performance,
webkit.
4 November 2005, 09:28
Hello!
Yesterday I had to install Windows 98 on a computer that is really
slow by today's standards: a Pentium I with only 32 MB of RAM.
The computer is a bit unstable - during system installation there
were some BSODs (probably corrupted RAM and/or a damaged
motherboard).
This guy will have a broadband Internet connection.
As a browser, I of course excluded Internet Explorer as an option.
I installed Opera 8.5 with all settings at the minimum (no skin,
no smooth zoom, no smooth scrolling, no special effects, nothing).
It booted and worked really fast - loved it ;). I wasn't expecting
that. Yet it caused BSODs on Opera's own site and some other sites
(told you the computer is unstable!).
Next, I had to give Firefox 1.0 a try, hoping it would not be
slower and would not cause as many BSODs as Opera.
Yesterday I finally saw, for the first time, the true difference
in start-up times between Opera and Firefox. Starting Opera takes
less than 3 seconds... yet with Firefox you wait, and wait some
more :) - two to three times as long. Page rendering, scrolling
and overall browser usage (menus, preferences, etc.) are also
slower.
Sadly, there's nothing to configure in Firefox to really make it
faster. Firefox also crashed on a few starts and on some sites
(like mine).
Conclusions:
- Firefox is not more stable than Opera (nor vice-versa). The
stability issues were caused by the hardware.
- Opera is a lot faster - really usable on such a slow computer.
Opera's only problem is PNG rendering: on my site Firefox was a
tad faster :). I actually managed to browse my site with Opera...
but Firefox crashed :).
P.S. This is not an "Opera fan rant". It's clear to me now which
browser is faster: Opera. Those who really want to know which
browser is faster ought to try them both on a really slow
computer.
Published in:
firefox,
opera,
performance.