Two Tales of Perception

I just finished The Case Against Reality – didn’t love it. But it did have some intriguing tidbits about perception, such as the split-brain patient who was an atheist on one side of the brain and religious on the other.

Reading it reminded me of stories about perception in two other books, which I want to pass on here. The first is from The Forest People, a lovely older book in which an anthropologist studies the Mbuti pygmy people, living with them for three years. Here he travels to grasslands with a friend from the tribe, who had never been outside the jungle before.

How are our own perceptions affected by our upbringing? How does heavier use of screens and virtual reality affect us? Perhaps it makes graphics easier, in some sense? I recall that when physically based shading models started to replace Blinn-Phong, people complained that things didn’t look right with the new models, even though they were more realistic.

Next is a little experiment described in Incognito, which is the best bathroom book ever – there’s something new every few pages.

This makes me wonder a bit about lag & latency and how they’re best measured, or can be mitigated.

Anyway, both books are wonderful, and I had to resist adding the stories about the sacred drainpipe and about chicken sexing.


I3D 2020 Call For Papers

The I3D Call For Papers is up: http://i3dsymposium.github.io/2020/cfp.html

Key dates:
13 December 2019 Paper submission deadline
20 December 2019 Extension for re-submissions (see details above)
10 February 2020 Notification of committee decisions
24 February 2020 Camera-ready deadline for accepted papers
9 March 2020 Poster submission deadline
5-7 May 2020 Conference at ILM in the Presidio, San Francisco


LinkedIn Invites

I’m posting because I gave a lecture on ray & path tracing last Monday and, at the request of the people running the class, ended with a little career advice. One thing I ranted about was receiving LinkedIn invites without any explanation. I did tell the audience, students, that they could ask me for a connection if they wanted. I guess I didn’t make it clear that they, too, should add an explanatory note – “loved your lecture, you’re the best person on the planet” or whatever – as I then received two invites without notes that I tracked down as being from students at the lecture (and so accepted). Next time I’ll be clearer…

I get a lot of LinkedIn invites – I suspect most people do. My rule is I accept if (a) I clearly know you or (b) you work for the same company as I do or have some other obvious direct connection or (c) you added a little note as to why we should connect.

I see varying advice on this. LinkedIn itself blogs on the topic, saying not to connect to random people. But most of the people who want to connect are semi-random – they usually are interested in computer graphics. Some site with an icky (to me) URL of linkedinriches.com (with “$” for that final “s” on the website itself) says I should accept everything except the utter randos, which does have a logic to it – who really cares who connects? But, if I get a note from the inviter, I’ll go with the assumption that I know them somehow. And if I see I have a connection with someone, I’ll assume I can contact them, as we somehow know each other – I don’t want to be the rando if I DM them.

My own feeling is that if someone doesn’t know me and doesn’t spend half a minute writing a sentence about why we should connect (I always do, when connecting with someone I don’t know), then I’ll ignore the request. As LinkedIn says, such requests come across as indistinguishable from spam, disingenuous, lacking in creativity, or just lazy. Am I missing something here?

Reply on Twitter, if you’re interested (sadly, spammers have led to us mostly turning off comments on this blog itself).

And if you did make a no-explanation invite and would like to explain why we truly should connect, great: email me, erich@acm.org (once upon a time I would not post my email address, but Gmail’s spam filter is quite effective). I currently see 35 pending invites, and you all look to be fine people (except you, Fred), so let me know why you want to connect.

I3D 2020 Location and Dates

Date: conference is May 5-7th 2020 (the venue was already booked for May 4th – I’ll let you puzzle out why). Call For Papers coming soon – due date likely in December (judging from past conferences).

Location: Industrial Light and Magic at the Presidio, San Francisco. Naty Hoffman did a lot of work to make this happen, and I’m super-excited that it will be there – pleasant buildings in a lovely location with cool memorabilia in a great city. And that last link is definitely worth clicking – Google Earth’s fun.

This is the I3D you don’t want to miss (especially once you submit your best work!).

Seven Things for September 23, 2019

Seven things:

NPR-Related book free until September 25

Passing it on – being interested in the topic, and being a packrat, I downloaded a copy. It is quite thorough, 163 pages long, by experts in the field.

From Mike Casey at “now publishers”:

I am pleased to announce that Foundations and Trends in Computer Graphics and Vision has published the following issue:

Volume 11, Issues 1-2
Line Drawings from 3D Models: A Tutorial
By Pierre Bénard and Aaron Hertzmann
https://www.nowpublishers.com/article/Details/CGV-075

Complimentary downloads of this article will be available until September 25, 2019 so you should be able to access it directly using the link provided.

If you don’t like getting on yet another mailing list, you could consider a temporary email address, e.g., https://temp-mail.org/en/

(thanks to Andrew Glassner for passing this email on to me.)

Seven Things for August 19, 2019

Here are some things that I found worthwhile:

  • The slides for Fresnel Equation Considered Harmful are worth a look. Two people at SIGGRAPH mentioned to me that this was a good presentation, and they’re right. Among other things, it describes the surprising result that using “the right equation” but sampling it in a simpler way (per RGB channel instead of per wavelength, which is what almost every renderer does) can give a worse result than various common approximations, such as Schlick’s.
  • PC Gamer’s “PC Graphics Options Explained” is surprisingly detailed, well illustrated, and quite readable. There are some bits I could pick at – e.g., the explanations for anisotropic filtering and for HDR – but overall it’s nicely done.
  • While I wasn’t looking, Microsoft added embedding of 3D models into PowerPoint, supporting six formats. I also just learned that the Minecraft Bedrock edition has binary glTF export built in, one of the six formats. And, worth repeating, Twitter allows Sketchfab models to be embedded, example here. That said, bummer: Facebook has pulled support for displaying 3D models in its system, I’m not sure why (3D photos, which are entirely fun, remain).
  • GauGAN won two awards at Real-Time Live! (video snippet). The fun part is that there’s a demo online. It’s a YMMV system, sometimes great, sometimes surprising. Note you need to scroll down and check the terms & conditions checkbox. Use the paint and fill tools, and control-Z is undo. Don’t miss out on picking styles below. When you’ve goofed off for an hour, see this comic. Update: this project is now called NVIDIA Canvas, still free for download.
  • This interactive web page is about JPEG encoding, giving you a sense of how this compression works. Occasionally the article’s a bit simplified, occasionally a bit arcane (changing various values at mystery locations), but the brilliant part is that it lets you poke at a JPEG file and see the effects immediately, with links in the article doing the same. It’s worth knowing about; YCoCg encoding can even be applied to real-time image display, saving memory and bandwidth.
  • Test your LCD monitor for blur effects. Read the sparse explanations (I could have used more technical background information), or just enjoy the illusions. Also, a strange thing: on one old Dell LCD monitor I use, scrolling this page up and down causes green to appear where it’s light gray. I have no idea why – explanations appreciated.
  • Math with Bad Drawings is a fun blog, and a great book, one of the best I’ve read this year. It references other good stuff, such as takeaways for teaching from the fascinating autobiography Mind and Matter, and this cartoon, which is relatable (working on Real-Time Rendering: “let’s use n for the normal; oh wait, we’re using that for the number of rows already, m for columns. And abcd and e and f and h and ijk and l and o and q and r and st and uvw and xyz already have meanings… that leaves us, ummm, g, and p. OK, what if we…”)
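As an aside on the first item: Schlick’s approximation itself is a one-liner. Here’s a minimal Python sketch (the function name and the clamping are my own choices, not from the slides); f0 is the reflectance at normal incidence, cos_theta the cosine of the angle between the view direction and the surface normal:

```python
def schlick_fresnel(f0, cos_theta):
    """Schlick's approximation to the Fresnel reflectance.

    f0        -- reflectance at normal incidence, in [0, 1]
                 (e.g., about 0.04 for typical dielectrics)
    cos_theta -- cosine of the angle between the view direction
                 and the surface normal
    """
    # Clamp to [0, 1] so grazing/negative cosines behave sensibly.
    cos_theta = max(0.0, min(1.0, cos_theta))
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

At normal incidence this returns f0, and at a grazing angle it rises to 1.0, which is the qualitative behavior the full Fresnel equations exhibit for dielectrics.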
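And on the JPEG item: the YCoCg transform mentioned there is pleasantly simple. A minimal Python sketch of the forward and inverse transforms (function names are mine, for illustration):

```python
def rgb_to_ycocg(r, g, b):
    """Convert RGB to YCoCg (luma, orange chroma, green chroma)."""
    y = r / 4 + g / 2 + b / 4
    co = r / 2 - b / 2
    cg = -r / 4 + g / 2 - b / 4
    return y, co, cg


def ycocg_to_rgb(y, co, cg):
    """Invert the transform above, recovering RGB exactly."""
    tmp = y - cg
    g = y + cg
    r = tmp + co
    b = tmp - co
    return r, g, b
```

The round trip is exact (and with a small integer trick, YCoCg-R, it can be made lossless in integer arithmetic), which is part of why it is attractive for real-time image storage.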

Please go with “DXR” or “DirectX Raytracing” (secret agenda: ray-traced Minecraft)

This is a post in which I sneak in an announcement at Gamescom from Microsoft and NVIDIA under the guise of an engaging post about terminology. I tell you now to avoid any anxiety or stress from surprise, to keep your heart healthy. The announcement is that official ray tracing support is coming to the Windows 10 Bedrock edition of Minecraft. Video here; it’s lovely:

Now the gripping terminology post:

I’ve seen “DirectX Raytracing” and “DXR” used for Microsoft’s DirectX 12 API extension – perfect. My concern with today’s announcement is seeing “DirectX R” and “DirectX R raytracing” getting used as terms. Please don’t.

OK, I feel better now. Go enjoy the video! It’s nice stuff, and I say that as an entirely unbiased source, other than being employed by NVIDIA and loving ray tracing and Minecraft. I particularly enjoy the beams o’ light effects, having played with these long ago.

Minecraft fans: It will work on only the Windows 10 Bedrock Edition, not the Java Edition, so Sonic Ether’s Unbelievable Shaders project (which offers some ray tracing, but currently does not use RTX hardware) is unaffected. There are some technical details on Polygon, and at the other end of the spectrum, various musings on Reddit.