
The ACM and IEEE can be considered Finnish

On November 1st of every year, the Finnish government publishes everyone’s salary. It’s a nice leveler of the playing field for workers, as they can see if they’re paid fairly and what to expect at another company.

In the U.S. we generally know elected officials’ salaries (president $400,000, senator $174,000, etc.) but not much else. Except for non-profits, which have to file public tax returns. Update: Angelo Pesce pointed out this useful site.

I’m doing my taxes this weekend. To avoid that time-sucking task – one that takes only five minutes in The Netherlands – I decided to go look up the ACM’s and IEEE’s forms. It takes a bit of searching, but it’s interesting to see where the money goes. To start, here are some ACM salaries from their 2015 return (actual return here):

Half a million for the outgoing Chief Executive Director – not bad. I’ve asked around, and on one level this amount is a bit shocking, but it’s evidently (for good or ill) the norm for non-profits of this size.

Putting this info on the blog makes me feel a bit embarrassed; it’s breaking a social norm here, revealing a salary. But, it’s public knowledge! We’re now used to using services such as Zillow to see someone’s property value – something we would have had to work hard to do back in 2005 (e.g., go to some government office and look up the deed). However, in the U.S., knowing someone’s salary is usually not something you can look up, and it’s pretty taboo. So, cheap thrills, and it’s easy to do so now for at least a few people.

Wading through the rest of the return turns up various tidbits. For example, the ACM’s overall budget:

So they added about $6.7 million to their total assets in 2015, and ended the year with assets of:

I find these documents worth poking at, just to get a sense of what’s important. For example, the ACM makes its revenue as follows (see Part VIII for details):

Expenses, in Part IX, go on and on, with a large portion of the $60 million going to conferences:

Conferences raise $29 million each year (the revenue snippet), so I conclude that conferences netted $5 million for the ACM in 2015. That cheers me – better than the opposite. Me, I’m curious how much the ACM Digital Library costs and how much revenue it raises, from individuals and institutions, but those numbers are not found here. I asked once back in 2012; the ACM doesn’t split out the DL income and costs from their other publication efforts.

There are lots of other tidbits in the return, but take a look for yourself.

Let’s go visit the IEEE – hmmm, wait, there are two of them. But both have small budgets, less than $7 million, so that’s not them. Searching a bit (quick, what does “IEEE” stand for?), I found them:

The Assistant Secretary and Executive Director (one person) gets $1.2 million – OK. The actual Director & Secretary doesn’t get paid at all, which I find entertaining somehow:

Now, the IEEE’s budget is a lot higher than the ACM’s, $436 million vs. $60 million:

It’s a different report format, so it’s not clear to me what assets they have.

It’s fun to poke around, e.g.:

ACM doesn’t seem to have any lobbying expenses. I wonder what the IEEE lobbies for – they’re really not spending much, but it’s more than zero. Better electricity? Lobbying is not in and of itself bad (I like it when the AMA lobbies against the use of tobacco just fine), but it’s interesting to see and great for forming unwarranted conspiracy theories.

OK, enough goofing off, this took way more time than I expected; back to my own taxes.

“Ray Tracing Gems” nears completion

Tomas and I turned over all our final files for Ray Tracing Gems to the publisher on January 2, and we’re gathering edits from the authors. The Table of Contents for the 32 articles is now public. The publisher’s webpage is up. There’s an Amazon page in progress (BTW, the after-the-colon title, “High-Quality and Real-Time Rendering with DXR and Other APIs,” was requested by the publisher to help search engines find the book).

The hardback book should be available at GDC and GTC, with a free electronic version (or versions) available sometime before or around then, along with a source code repository. Also, the book is open access, under this CC license. This means that the authors, or anyone else, can redistribute these articles as they wish, as long as it’s a non-commercial use and they credit the book as the source.

Here’s the cover, which should be on the other sites soon.

"Ray Tracing Gems" cover

By the way, if you want to read an article about ray tracing actual gems, this one is a good place to start. I happened upon it by accident, and it’s educational, approachable, and not dumbed down. The design criteria for a good gem cut are fun to read about: maximize reflected light as well as contrast, take into account that the viewer’s head will block off light, and so on. If you need a more serious paper from graphics people, there’s this article. Surprisingly, though it is fairly old, it is still newer than any of the articles cited by the first, much more recent article.

Atomic Age Photon Mapping

Well, not exactly Atomic Age, which I think of as the late ’40s through mid ’60s, but close – what would you call it when you’re rendering on a plotter and drawing “+” signs for shading?

Arthur Appel’s paper from 1968 is considered the first use of ray casting for rendering, for eye rays and shadow rays.

One surprising part is that the paper includes the idea of shooting rays from the light and depositing the results on the surfaces – early photon mapping, or radiance caching, or something (details aren’t clear).
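The paper doesn’t spell out the deposition details, but the general idea can be sketched in a few lines (a toy reconstruction of the concept, not Appel’s actual algorithm – the plane, light position, and grid resolution here are all made up): shoot rays from a point light, record where each one lands on a surface, and “plot” the accumulated density, plotter-style, with “+” signs.

```python
import math
import random

# Toy sketch (my reconstruction, not Appel's actual method): shoot rays
# from a point light, deposit each hit on a ground plane, then "plot"
# the density with '+' signs, in the spirit of his plotter output.
random.seed(1)

GRID = 20
grid = [[0] * GRID for _ in range(GRID)]
light = (0.0, 4.0, 0.0)                  # point light above the plane y = 0

for _ in range(50_000):
    # uniform random direction via rejection sampling of the unit ball
    while True:
        d = [random.uniform(-1, 1) for _ in range(3)]
        n = math.sqrt(sum(c * c for c in d))
        if 1e-6 < n <= 1.0:
            break
    d = [c / n for c in d]
    if d[1] > -0.1:
        continue                         # keep rays heading mostly downward
    t = -light[1] / d[1]                 # parametric hit with the plane y = 0
    x = light[0] + t * d[0]
    z = light[2] + t * d[2]
    ix = math.floor((x + 5.0) / 10.0 * GRID)
    iz = math.floor((z + 5.0) / 10.0 * GRID)
    if 0 <= ix < GRID and 0 <= iz < GRID:
        grid[iz][ix] += 1                # deposit the light sample on the surface

for row in grid:                         # brightest directly under the light
    print("".join("+" if hits > 20 else " " for hits in row))
```

Flip this around – trace from the eye as well and meet in the middle – and you have the core of modern photon mapping.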

Update: an anonymous source let me know, “When I worked briefly at the IBM TJ Watson lab, I made a point of seeking out Art Appel. He was friendly and nice. As I recall, he said that he was working as technical support at the time of the ray tracing work. As he described it, most of his day was spent hanging around, waiting to answer the phone and help people with computer issues (this may have been a self-effacing description of a very different or important job, for all I know). He said that he had lots of free time during the day, and he was interested in using the computer to make images, so his ray tracing work was basically a hobby!”

 

SIGGRAPH 2018 Stuff

First, the links you need:

  • Stephen Hill’s wonderful SIGGRAPH 2018 links page. Note the fair number of presentations recorded and quickly put up.
  • Material from the High Performance Graphics conference preceding SIGGRAPH is all online. Hats off to them for doing this.
  • Matt Pharr guest-edited a special section of the latest issue (vol. 37, no. 3, June 2018) of ACM TOG, full of papers on how production renderers work.
  • Bonus link: this GDC 2018 link collection by Krzysztof Narkowicz; also, GDC 2014 and earlier by Javier “Jare” Arevalo (thanks to Stephen Hill for the tip-off).

Beyond all the deep ray learning tracing, which I’ve noted in other tweets and posts, the one technology on the show floor that got “you should go check it out” buzz was the Looking Glass Kickstarter, a good-looking and semi-affordable (starting in the $600 range, not thousands or higher) “holographic” display. 60 FPS color, 4 and 8 megapixel versions, but those pixels are divided up among the 45 views that must be generated each frame. Still, it looked lovely, and vaguely magical (and has Sketchfab support).
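A quick back-of-the-envelope shows how thinly those pixels get spread (assuming an even split; real lenticular layouts interleave views in more complex patterns, so this is only a rough bound):

```python
# Rough per-view resolution for a display that divides its panel pixels
# among 45 views. The even split is an assumption for illustration.
VIEWS = 45
for megapixels in (4, 8):
    per_view = megapixels * 1_000_000 // VIEWS
    print(f"{megapixels} MP panel -> ~{per_view:,} pixels per view")
```

So each view gets on the order of 90K–180K pixels, i.e., well below standard-definition resolution per view.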

I mostly went to courses and talked with people. Here are a few tidbits of general interest:

  • Shadertoy is one of my favorite websites, but it takes too long to load and looks like it’s locked up. I learned that you can avoid this problem! Sign in, go to your profile, Config, and check “Use previews (avoid compilation times)”. The site is so much more usable now. Too bad it’s not the default – I expect because they want to show you cool things without clicking, but then they end up not showing anything for a while, e.g., 22 seconds for the popular page to compile on my fast workstation. Now this page shows up immediately, and I don’t mind that the animations don’t run when my mouse hovers over the image. (thanks, David Hart!)
  • Colin Barré-Brisebois mentioned in NVIDIA’s real-time ray tracing course that Schlick’s Fresnel approximation could not be used in their work, but I didn’t quite catch why. The notes don’t mention this – I need to follow up with him… message sent. Aha, he replies, “It was because of total internal reflection. The Schlick approx doesn’t handle it.”
  • One of the speakers in Path Tracing in Production mentioned in passing that some film frames took 300 hours of compute per frame, 1600 rays per pixel. Aha, it was for “Transformers: The Last Knight.” I recall Jim Blinn had some rule-of-thumb long ago about how frames will take 20 minutes no matter how much faster computers get and algorithms improve efficiency. I think that amount needs updating, maybe based on cost (after all, the computers he was using for computation back then weren’t cheap).
  • The PDF notes for this course were extensive, though the course lectures were not recorded (heaven forbid anyone capture an unauthorized bunny or chimp). It’ll be interesting once consumer body cams become a thing. Anyway, the notes capture all sorts of bits of wisdom, such as ways of finding structure to help denoise images (yes, film rendering companies use denoising, too), e.g., “we used the tangent vector of the fur to provide the denoiser with ‘normals’ as it proved to have higher contrast than either the true normals of the fur, or the normals of the skin that the fur was spawned from.”
  • You know quaternions. David Hart noted a different algebra, octonions, which I’d never heard of before. Bunch of videos on YouTube.
  • Regrets: I missed the “Future Artificial Intelligence and Deep Learning Tools for VFX” panel, and there’s no video AFAIK. I wanted to go, because after Glassner’s intro to deep learning course (which was recorded, and well-attended & well-received), Doug Roble from Digital Domain showed me a little bit of their markerless facial mocap system, which looked great. He writes, “We’ll probably have some YouTube videos… soon.”
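Colin’s point about Schlick can be made concrete with the standard textbook formulas (a sketch of the general issue, not the code from the course): Schlick’s approximation is driven by the incident angle and the normal-incidence reflectance alone, so when exiting a denser medium it sails right past the critical angle, while the full dielectric Fresnel equations correctly return total internal reflection.

```python
import math

def schlick(cos_i, n1, n2):
    """Schlick's approximation to Fresnel reflectance (unpolarized)."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_i) ** 5

def fresnel(cos_i, n1, n2):
    """Full Fresnel reflectance for dielectrics, handling TIR."""
    sin_t2 = (n1 / n2) ** 2 * (1.0 - cos_i ** 2)   # Snell's law, squared
    if sin_t2 > 1.0:
        return 1.0                                 # total internal reflection
    cos_t = math.sqrt(1.0 - sin_t2)
    rs = ((n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)) ** 2
    rp = ((n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i)) ** 2
    return 0.5 * (rs + rp)

# Going from glass (1.5) into air (1.0) at 50 degrees, which is past the
# critical angle of about 41.8 degrees:
cos_i = math.cos(math.radians(50.0))
print(schlick(cos_i, 1.5, 1.0))   # small value -- misses TIR entirely
print(fresnel(cos_i, 1.5, 1.0))   # 1.0 -- total internal reflection
```

A common fix is to evaluate Schlick with the transmitted-side cosine, or simply check for TIR (sin²θt > 1) first and return 1.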

Finally, this, a clever photo booth at NVIDIA. The “glass” sphere looks rasterized, since it shows little refraction. Though, to make it solid, or even fill it with water, would have been massively heavy and unworkable. Sadly, gases don’t have much of a refractive index, though maybe it’s just as well – a chloroform-filled sphere might not be the safest thing. Anyway, it’s best as-is: people want to see their undistorted faces through the sphere. If you want realism, fix it in post.

Ray Tracing Monday

OK, everything happened today, so I’m starting to believe the concept of time is no longer meaningful.

First, NVIDIA announced its consumer versions of their RTX ray tracing GPUs, which should come as a shock to no one after last week’s Ray Tracing Monday at SIGGRAPH. My favorite “show off the ray tracing” demo was for Battlefield V.

Then, this:

https://twitter.com/Peter_shirley/status/1029342221139509249

I love free. To get up to speed on ray tracing, go get the books here (just in case you can’t click on Pete’s link above), or here (our site, which shows related links, reviews, etc.). Then go to the SIGGRAPH DXR ray tracing course site – there’s even an implementation of the example that’s the cover of Pete’s first book.

Up to speed already? Start writing an article for Ray Tracing Gems. At SIGGRAPH we found that a few people thought they had missed the proposals deadline. There is no proposals deadline. The first real deadline is October 15th, for completed articles. We will judge submissions, and may reject some, but our goal is to try to work with any interested authors before then, to make sure they’re writing something we’ll accept. So, you can informally write and bounce ideas off of us. We avoided the “proposals” step in order to give people more time to write and submit their ideas, large and small.

BTW, as far as free goes, we’re aiming to make the e-book version of Ray Tracing Gems free, and also to have the authors retain reprint rights for their work.

All for now. Day’s not over yet.

65.16% of the Figures in RTR4 are Now Downloadable

You can now download 440.5 out of the 676 figures in the fourth edition. Go get them here. (The 0.5 is because there are five figures where we include only half the figure, as the other half was from someone else.)

There are also now links on the home resources page of figures where we took a snapshot from a live demo that runs in a browser. For example, the three listed here are a fun place to start.

RTR4 Book Signing at SIGGRAPH

We’ll be signing books at the ACM bookstore on Tuesday, August 14, from 12:30 to 1 PM. All six of us should be able to make it, and with any luck there will be books, too (go, printers, go!). Even if not, we have a back-up plan… Oh, and do note it’s the bookstore, which is usually tucked away somewhere and is not on the exhibit hall floor.

Anyway, we hope to see you there!

Less Movable Targets

Here’s an update to my previous blog post, on the volatility of web links.

The Twitter post has a bunch of responses, with some useful tidbits in there. Some resources mentioned: HTML5 UP! for free CC templates; gamedev.net, which has been around for almost 20 years and can act as an archive; and gamedevs.org, which keeps some old presentations around. Go three paragraphs down for some web hosting suggestions. The idea of using the archive.org link as the “real” link is clever (and a bit sad), but assumes archive.org will always be around. Note that publishers such as the ACM allow you to put your published articles up on your homepage, your institution’s website, and on non-commercial repositories. I’m not sure how entities such as ResearchGate (where I’ve seen a number of papers stored) fit into this picture – they appear to be for-profit, e.g., they sell advertising, so I don’t think they fall into any of the ACM’s categories. I appreciate their efforts, but am concerned that papers there may go away if ResearchGate is ever challenged by the ACM or others. Again, long-term durability is a question.

Also see the comments after the original post. My comment on “The sooner these are displaced by open publications like the JCGT, the better” is that, in graphics, there are no other free (to both readers and authors) journals, at least none that I know about. arXiv maybe qualifies. Looking there today, this article seemed like a handy summary, pointing to some resources I hadn’t known of before. But, trying to go to a site they mention in their article, Chrome warns, “Attackers might be trying to steal your information from dgtal.org” – OK, never mind. There might be great stuff at arXiv, but it seems like a firehose (10 articles published in graphics in the last week), without serious peer review. Editorial filtering and peer review are worth a lot. I guess you might be able to use a strategy of putting your preprint at arXiv, sort of like ResearchGate but less questionable (arXiv is run by Cornell). This approach is underutilized within graphics, AFAIK: only 2 papers on our refs page are available this way, vs. 25 for ResearchGate. If someone wants to explain what I’m missing here, great! Update: the ACM now permits authors to put preprints on arXiv.

Thanks to you all for the followups; my thoughts remain about the same: corporations come and go, more quickly than we expect. While I have a lot of faith in various institutions, ultimately I think the entity that best looks out for my interests is me. Having my own domain and website is good insurance against the vagaries of changes in job status, corporate services (or their existence), and webmasters. Me, I’m a cheapskate: http://erichaines.com is just a subdomain of realtimerendering.com, of which I’m the prime webmaster; we also host a number of other groups as subdomains, such as the Advances in Real-Time Rendering course notes repository and Ke-Sen’s invaluable work tracking conference articles – doing so costs me no time or money, as others maintain them. So another option is to share a domain and host among a bunch of people.

Yes, your own website costs a little money (the price of two cups of Starbucks per month), but admit it: you pay more in a month for your smartphone and internet service provider than the yearly cost of a website. It’s a bit of effort initially to register a domain and set up a website, but once the template and blog are in place, you’re done. Write a new article or slide set, one that took you hours or weeks to create? It’s five minutes to add it to your web page and upload it. Morgan McGuire, Andrew Glassner, and I like bluehost. Sven Bergström likes digitalocean for $5/month hosting, and gives some setup and admin tips. His previous favorite was site5. Sebastian Sylvan likes nearlyfreespeech, which I hadn’t heard of and looks quite cheap for a personal site (possibly something like $3.65 a year, plus $12 per gigabyte stored – or maybe less, as the pricing is not clear – with a free gigabyte of downloads a day), assuming you’re not serving up huge files and don’t get popular; ijprest notes in the comments that Amazon’s S3 hosting is bare bones, just basic hosting, but about as cheap as nearlyfreespeech and is pretty much guaranteed to outlast you.

Update Nov. 2019: A few more options, just in case. Google Domains and Namecheap are cheaper still for domain name registration, with Namecheap sounding a bit less expensive (but we’re talking a few dollars a year here, tops). For free hosting, Github is another interesting option. The advantages include collaboration and automatic backup of any changes, a la Git. We use this for I3D, for example, with the site’s elements visible to all. For non-programmer-types there are plenty of other options.

Oh, and the presentation from 2012 I mentioned in my last post that was no longer available – dead link – is available again, as Duncan Fewkes sent me a copy and Michal Valient gave me permission to host it. It’s now here – a few minutes’ work on my part.

Question for the day: if Gmail and Google Docs suddenly went away, would this cause a collapse that would take us back to the 1990s, the 1950s, or would the loss kick the world all the way back to some time in the 1800s? Just a thought; you might want to use Google Takeout or another backup method now and then. If nothing else, visiting your Google Takeout site is interesting in that you see the mind-boggling number of databases Google has in your name.

Moving Targets, and Why They’re Bad

Executive summary: if you write anything or show off any images, you should make a real website, both for yourself and for others. The followup post gives some resources for doing so.

We’ve been updating the Real-Time Rendering site (take a peek – you might at least enjoy the 4th edition cover). Today I’ve been grinding through updating URLs for the references in the book. Even though the book’s not yet out, you can see what articles we reference and jump to the article from this page.

Most of the articles can be found through using Google or Google Scholar. A few articles are trickier to find, or have a few URLs that are relevant – that’s the value I feel I’m adding by doing this laborious task. The other reason is for helping avoid link rot – I’ll explain that in a minute. Another is virus protection. For example, one blog URL, for the article “Render Color Spaces” by Anders Langlands, has had its domain anderslanglands.com (DON’T GO THERE (REALLY)) taken over by some evil entity in May 2018 and now leads to a page full of nastiness.

Going through our reference page today and adding links reminds me how tenuous our storage of knowledge is for some resources on the internet. Printed journals at least have a bunch of copies around the world, vs. one point of failure. I’ve noted this before. My point today is this: if you publish anything, go buy yourself a domain and host it somewhere (I like bluehost, as do Morgan McGuire and Andrew Glassner, but there are no doubt cheaper ways). All told, this will cost you maybe around $110 a year. Do it, if you care about sharing your work or are at all serious about your career (e.g., lose your job or want another? You now have a website holding your CV or work, ready to show). URLs have a permanence to them, vs. company-specific semi-hosting schemes such as Github or Dropbox, where the rules can and do change. For example, I just found a Github-based blog entry from Feb. 2017 that’s now gone (luckily still on archive.org). With some poking around, I found that the blog entry is in fact still on Github, but appeared to be gone because Github had changed its URL scheme and did not redirect from the old URL to the new one.

Once you have a hosted URL, look at how others arrange their resources, e.g., Morgan McGuire recently moved all his content from the Williams College website to his own personal site. Grab a free template, say from W3 Schools, or copy a site you like. Put any articles or presentations or images or whatever that you want people to find on that site. Me, I’m old school; I use basic HTML with a text editor and FileZilla for transfers, end of story. Start a WordPress or other blog, which is then hosted on your site and so won’t die off so easily. Once you have a modest site up, you are done: your contributions to civilization are available to everyone until you forget to renew your domain or pay for web hosting. Assuming you remember, your content is available until you’re both dead and no one else keeps up with the payments (another good reason to renew for the longest duration). Setting up your own website isn’t some ego-stroking thing on your part – some of the rest of us want continued access to the content you’ve provided, so please do keep it available. If your goal in writing is to help the graphics community, then allow your work to live as long as possible. “But my blog posts and whatnot have a short freshness ‘read by’ date,” you complain. Let us decide that; as someone who maintains the Graphics Gems repository, a collection of articles from 1990-1995, I know people are still using this code and the related articles, as they report bugs and errata to me. “I have tenure, and my school’s been around for 200 years.” So when you retire, are they going to keep your site going?

Most of us don’t grab a URL and host it, which is a pity for all concerned. Most of the links I fixed today rotted for one of three reasons: the site itself died (e.g., the company disappeared; I now can’t find this talk from 2012 anywhere, and at least 14 other sites link to it), the subdirectory on the site was deleted (e.g., for a student or faculty member no longer at the institution), or the URLs were reorganized and no redirection was put in place (and if you’re a webmaster, please don’t do this – take the time to put in some redirection, no matter how untidy it may feel to you). Some resources that still work are hanging on by a thread, e.g., three articles on our page are served up by FTP only. FTP! Update: see my follow-up post for where to find that 2012 talk now.

BTW, people have worked on how to have their sites outlive them, but so far I don’t know of a convincing system, one where the service itself is likely to outlast its participants. Some blog and presentation content does outlive its creator, or at least its original URL, as much of the internet gets archived by The Wayback Machine. So, for the virus-ridden anderslanglands.com site, the article I wanted to link to is available on archive.org. Jendrik Illner does something for his (wonderful) summary posts that I hadn’t seen before: each link also has a “wayback-archive” link for convenience, in case the link no longer works. You can also easily try such links yourself on any dead site by using this Chrome extension. With this extension active, by default a dead page will cause the extension to offer to look the page up on archive.org. Links have an average life of 9.3 years before they rot, and that’s just the average. You’re likely to live longer, so do your future older self a favor by saving them some time and distress: make a nice home for your resources now so you don’t have to later.

If you’re too busy or poor to host your own content, at least paste your important URLs into archive.org’s site (you can also use the “Save Page Now” option in the Chrome extension, if you have a lot of pages) and your content will get archived (though if it’s a large PDF, maybe not). However, content on archive.org is not included in Google searches, so articles there effectively disappear unless the searcher happens to have the original URL and thinks to use the Wayback Machine. Also, people may stop looking when they try your original URL and find, for example, a porn site (e.g., this archive.org graphics site’s original URL goes to one now). This won’t happen if you have your own URL and maintain it.

For longer-term storage of your best ideas, don’t just blog about a topic, submit it to a journal (for example, JCGT takes practical articles) or article collection book (e.g., GPU Zen series, Ray Tracing Gems) and so have it become accessible for a good long while. It is possible and reasonable to take good blog content and rework it into an article. Going through peer review and copy editing will polish your idea all that much more.

These ramblings reflect my (limited) view of the world. If you know other approaches or resources to combat any aspect of link rot, please do let me know and I’ll fold them in here and credit you. Me, I hate seeing information get lost. Fight entropy now. Oh, and please put a date on any page you put up, so the rest of us can know if the page is recent or ancient history. Blog entries all have dates; everything else should, too.

Update: see my next post for some followups and a bunch of inexpensive options for making your own site.