Author Archives: Eric

OBJ lives on

Yesterday I accidentally drag-and-dropped an OBJ file onto Visual Studio 2022 (I was aiming for G3D’s viewer, which I love for OBJs, despite one annoying bug – hoping it’s fixed someday). To my surprise, VS2022 opened and displayed it! It even looked fairly reasonable. Googling a bit, they seem to also support FBX and DAE (who uses DAE nowadays?). Sadly, this feature doesn’t work for glTF or USD files (yet…).

The controls are a little odd: the mouse wheel moves the model up and down a bit, but control-mousewheel does what you'd expect, moving in and out. Shift-mousewheel moves right and left. Control-leftmouse pans, alt-leftmouse rotates. There are lots of other UI buttons around the viewport to explore. Fun!

VS2022, you surprised me. G3D’s prettier view follows.
G3D, with customized background. However, there’s a bug where the R of RGBA is interpreted as the A.

USD and glTF Resources Page

Here’s my new page about USD and glTF resources: https://www.realtimerendering.com/usd_gltf.html – if you see anything missing or wrong, let me know. I’m no expert, still learning. Which is in part why I put this page up, to get feedback.

This past year I made two minor contributions to the USD Working Group Assets repo. Being involved in the ASWF's activities and Slack channel has been a great help in learning about USD (and also a bit about glTF along the way). Fun thing: if you have an iPhone, click here – I like that Apple's AR Quick Look is simply built in. Here's one Minecrafty test model on my office floor.

Lights on, no one quite home

I gave a talk about ray tracing yesterday. For my entertainment, before the talk I asked ChatGPT what ray tracing is. Its full answer was pretty good overall. However, it stumbled at the start – I’ll let you figure it out (hint: Plato). There are some other small oddities (e.g., “follow similar paths”) and “I wouldn’t have put it exactly like that” bits. Here’s the question and its answer.

What is ray tracing?

ChatGPT:
Ray tracing is a rendering technique used in computer graphics and computer-generated imagery (CGI) to simulate the way light interacts with objects in a virtual environment. It is a method for generating highly realistic images by tracing the path of individual rays of light as they interact with the objects in a scene. Ray tracing can simulate complex optical effects, such as reflections, refractions, shadows, and global illumination, with a high degree of accuracy and realism.

Here’s a simplified explanation of how ray tracing works:

1. Ray Generation: In a ray tracing system, a virtual camera is placed within a 3D scene. Rays of light are generated from the camera’s viewpoint, with one ray typically cast for each pixel on the screen.

2. Ray Intersection: Each ray is traced into the scene, and its path is followed as it interacts with objects. When a ray encounters an object, the software calculates how it intersects with that object’s geometry.

3. Shading and Lighting: After determining the intersection point, the renderer calculates how the object at that point interacts with light. This involves considering factors like material properties (e.g., reflectivity, transparency), surface normals, and lighting conditions (e.g., direct and indirect lighting).

4. Secondary Rays: In addition to primary rays cast from the camera, secondary rays may be generated to simulate effects like reflections, refractions, and shadows. These rays follow similar paths as primary rays but may be traced in different directions based on the materials and lighting in the scene.

5. Recursive Ray Tracing: To simulate complex lighting interactions, ray tracing often involves recursion. For example, when a ray hits a reflective surface, a new ray is generated to simulate the reflection, and this new ray undergoes the same process of intersection and shading. This recursion continues until a termination condition is met.

6. Image Formation: As the rays interact with objects and lights in the scene, they carry information about the color and intensity of light at each pixel. This information is used to generate the final image that is displayed on the screen.

Ray tracing is known for its ability to produce photorealistic images, making it a popular choice in fields like computer gaming, visual effects in film and television, architectural visualization, and more. However, it is computationally intensive and can require significant processing power, so real-time ray tracing in complex scenes is still a demanding task that often requires powerful hardware, such as high-end graphics cards. Advances in hardware and software continue to make ray tracing more accessible and practical for a wider range of applications.
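The numbered steps above can be sketched in a few dozen lines. This is my own illustration (not ChatGPT's), assuming the simplest possible scene – one sphere, one point light, a camera at the origin – with Lambertian shading and a single shadow ray standing in for the "secondary rays" step:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def sphere_hit(origin, direction, center, radius):
    """Step 2: ray-sphere intersection; returns nearest hit distance, or None.
    The direction is assumed normalized, so the quadratic's 'a' term is 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-4 else None

def trace(width=8, height=8):
    center, radius = [0.0, 0.0, -3.0], 1.0   # one sphere in front of the camera
    light = [2.0, 2.0, 0.0]                  # one point light, up and to the right
    image = []
    for j in range(height):
        row = []
        for i in range(width):
            # Step 1: generate one primary ray through the middle of pixel (i, j)
            x = (i + 0.5) / width * 2 - 1
            y = 1 - (j + 0.5) / height * 2
            d = normalize([x, y, -1.0])
            t = sphere_hit([0.0, 0.0, 0.0], d, center, radius)
            if t is None:
                row.append(0.0)              # missed everything: background
                continue
            # Step 3: shade using the surface normal and direction to the light
            p = [t * di for di in d]
            n = normalize([pi - ci for pi, ci in zip(p, center)])
            l = normalize([li - pi for li, pi in zip(light, p)])
            # Step 4: a secondary (shadow) ray from the hit point toward the light
            shadow_origin = [pi + 1e-3 * ni for pi, ni in zip(p, n)]
            in_shadow = sphere_hit(shadow_origin, l, center, radius) is not None
            diffuse = max(0.0, sum(ni * li for ni, li in zip(n, l)))
            row.append(0.0 if in_shadow else diffuse)
        image.append(row)
    return image
```

Step 5 (recursion) would spawn further rays from reflective or refractive hit points, and step 6 is just writing the per-pixel values out as an image; both are omitted here to keep the sketch readable.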

HPG 2023 and EGSR 2023 Paper Links Available

The indefatigable Ke-Sen Huang has collected links for papers at both HPG 2023 and EGSR 2023:
HPG at https://www.realtimerendering.com/kesen/hpg2023Papers.htm – includes direct links to recordings
EGSR at https://www.realtimerendering.com/kesen/egsr2023Papers.htm

I’ve heard both were well-attended this year (and I’m sad I missed them – the timing didn’t work out for me), with about 150 attendees at EGSR, something of a record.

Elena Garces and I are co-chairing papers for EGSR 2024, so keep up the great work and interest, all! I would expect the papers deadline to be around April, similar to this year’s Call for Papers deadlines.

Coolerer Color QR Codes

It turns out that, at the bottom right of the project page I pointed to in my previous post, there’s a link to a Windows implementation of their system. I hadn’t noticed that; thanks to James Hung-Kuo Chu, first author of the work, for the tip-off. He has a nice example of a colored QR code on his web page.

The Windows program works! Set a URL or text to search, upload a picture, ask for a QR code.

Here’s an example. Blur your eyes to help figure out which well-known computer graphics image it represents. I find the QR code needs to be fairly small in your camera’s view for it to be detected as one.

A smaller one, of me. Again, blur your eyes. When changing the size, it’s likely best (?) to “pixel resize” (nearest-neighbor, a.k.a. sample and hold), not resample.

Addendum: James noted that using the “Apply contrast enhancement” option can give a better result. Here’s one with that:

Addendum 5/28/2023: I just noticed this article, pointing to another QR image generator. The color one’s more fun, IMO, but I wanted to scribble down where this other tool is.
And addendum 6/7/2023: this cool QR maker guided by Stable Diffusion (thanks to Andrew Glassner for pointing it out). Actual creator tool here.
Also, 12/14/2023: a research topic is to make a 2D barcode that tries to be more aesthetically pleasing than a QR code. Me, I think that ship has sailed – adopting a new barcode format seems unlikely – but, interesting topic and a view of “what might have been.”

Cooler QR Codes

In talking with Niloy Mitra, I noticed he had a cool QR code on his page. I asked him about it, and he pointed me to this decade-old work he coauthored. Searching a bit more, I found a cute web-browser version for making your own. It’s a bit flaky, in that a lot of the images I tossed in looked really bad halftoned. Maybe if I had a clue what the options were, it’d come out better. But with one image it worked pretty well! I just put in the URL I want the code to lead to.

Addendum: but wait, there’s more! Another generator, from the author, that also permits color output. See my next post.

SIGGRAPH 2023 Hotel Reservations Now Open

Like it says. Hotel reservations for SIGGRAPH 2023 are now open: https://s2023.siggraph.org/travel-accommodations/ – from what I see, you can cancel cost-free up to the end of July. So, if there’s any chance you’ll go, lock it in now. To save you a click, SIGGRAPH is August 6-10 in Los Angeles.

It’s SIGGRAPH’s 50th year, so I expect a fair bit of retrospective stuff. In fact, the Electronic Theater is accepting such material until May 1st, so if you’re of an age, dust off those VHS tapes for submission.

I3D 2023 is in-person, at last

I3D 2023 is back to being in-person, first time since 2019. May 3-5, hosted at the Unity office in Bellevue, WA, USA.

The last in-person I3D was in 2019. I helped chair the conference on-line in 2020 and 2021, and 2022 was also remote. We survived, good work got published, but half the joy of any conference is meeting with people, new and old.

The nice thing about on-line conferences is that they cost just about nothing. Which means all the money we raised from sponsors in 2019 and later is available for this year’s conference. Almost all sponsors generously rolled over their donations each year, for the time when the conference would again be in person. That time is now! I don’t know for certain, but I suspect a fair number of new people will attend this year, since there was money in 2020 (before COVID hit) for student travel grants and outreach programs.

Please spread the word and retweet.

Seven Things for December 26, 2022

Today’s theme is interactive mash-ups and remixes:

My improved homepage