Proposal: NFTNYC MetaFactory Popup / Digiphysical Launch Party

Backlog updates

Apologies for the week gap; I've been in crunch mode preparing for the event, and this thread has for a while served as documentation that'll come in handy for post-production.

UV mapping

Previous notes:

Just want to mention that this is what the video looks like when exported into the VRChat version of Lume Studios. It's essentially 9600x1080 scrunched into a 1080p video (whatever YouTube plays by default; if you self-host, it can default to a high resolution like 8K automatically).


Watch on YouTube: Lume Anamatic Rough 1

We can preview the wall screens pretty quickly when everything is merged into one video. It looks cool when one side shows content from the digital world and the other shows video from the physical event. We should lean into this setup for a VR encore event, or perhaps use the floor / ceiling to blur the line between them.
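For anyone curious how a merged strip like that can be put together, here's a minimal sketch using ffmpeg's hstack filter driven from Python. The five-screen split, 1920x1080 inputs, and filenames are assumptions for illustration, not necessarily the exact pipeline we used:

```python
# Minimal sketch: stitch several per-screen clips into one wide strip
# (e.g. five 1920x1080 clips -> one 9600x1080 video) with ffmpeg's
# hstack filter. Filenames and the five-screen split are assumptions.
import subprocess

screens = [f"screen_{i}.mp4" for i in range(5)]  # hypothetical per-screen clips

# Build the "-i a.mp4 -i b.mp4 ..." input arguments
inputs = [arg for clip in screens for arg in ("-i", clip)]

# "[0:v][1:v]...[4:v]hstack=inputs=5[out]" lays the streams side by side
graph = "".join(f"[{i}:v]" for i in range(len(screens)))
graph += f"hstack=inputs={len(screens)}[out]"

subprocess.run(
    ["ffmpeg", *inputs,
     "-filter_complex", graph,
     "-map", "[out]",
     "-c:v", "libx264",
     "merged_9600x1080.mp4"],
    check=True,
)
```

A player that preserves the source resolution can then stretch each slice back out across the physical or virtual screens.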

This UV map proposes that in the future, instead of one render texture target, we could have one per projector screen. Certain limitations in the video players we used prevented us from doing that in VRChat.
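To make the single-render-texture setup concrete, here's a small sketch of how each screen's quad samples a horizontal slice of the shared atlas; with one render texture per projector screen, each quad would instead use the full 0-1 UV range. The five-slice layout is an assumption matching the 9600x1080 strip above:

```python
# Hypothetical sketch of the single-atlas UV layout: each projector
# screen's quad samples one horizontal slice of a shared 9600x1080
# render texture. NUM_SCREENS matches the assumed five-screen strip.
NUM_SCREENS = 5

def uv_rect(screen_index: int) -> tuple[float, float, float, float]:
    """Return (u_min, v_min, u_max, v_max) for one screen's slice of the atlas."""
    u0 = screen_index / NUM_SCREENS
    u1 = (screen_index + 1) / NUM_SCREENS
    return (u0, 0.0, u1, 1.0)

# Example: screen 2 samples u in [0.4, 0.6] and the full v range
print(uv_rect(2))  # (0.4, 0.0, 0.6, 1.0)
```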


Planning

We all synced up on Figma and Zoom with the venue owner to do some floor planning of how the space might look throughout the main events.

Boomboxhead then modeled out the VR space based on the floor plan to give us a preview of the layout spacing.

We added some props on top of the pedestals to create conversation pieces that help tell the narrative of what MetaFactory does, from physical goods to interoperable digital ones.


Here's a preview of how the physical and digital venues looked next to each other on the morning of the event, before the physical build began.


Videos

For much of the past couple of weeks we focused heavily on producing video content for the event, down to the last second before the show. We adopted Frame.io to upload and get rapid feedback, which was really clutch. Frame was fast to comment on and scrub through, and it integrates directly into Adobe software for editing.

Sometimes while editing, boomboxhead and I ran into storage issues because of how big high-resolution projects can get. Only later did we find out that Lume uses VJ software like Resolume and supports NDI, which gave us lots of power and flexibility in how we could display content on the screens.

Overall we uploaded over 160 GB of digital content to Frame. I recorded tons of 4K 360° video to see how it would look displayed in the physical venue, although I don't think we got to use any of it for that purpose.

I recorded a few looped videos for the downstairs VIP blackroom screens:


MF Store

We wanted to pre-record some content as a plan B, plus a few practice rehearsals for presenting the vision. In one night, Boomboxhead basically forked the MF shop project into something almost totally new / revamped.

Before


After

For a presentation we gathered in the lounge area. I used a 360 camera to capture from various angles (so we could later test how the footage looks in the Lume Studios setup) while boomboxhead asked Drew (on green screen) questions as DAOFREN, and a guest cat joined us. Clips of this can go into a post-production documentary.

During production, boomboxhead worked on a 3D photo studio inside VR. With a hotkey we can switch between camera views and take pics / video from our desktop screens. More deets will be in a separate thread (fashion-shoot related).


Day after event

This project was intense but also so much fun. Friends from NeonDAO came by. The founder of LUME wants to connect with us in VR; he has a Vive and a PC. Pab killed it with the POAP designs for this event, props. I made them into 3D for fun.


POAPs for attendees / VIPs / virtual attendees (VR part 2)

For my presentation I made great use of the M3 and MF shop websites, with HackMDs relevant to DAO tools / avatar interop open in my background via PureRef, plus a VTuber setup with metaphysical merch enabled. This is what it looked like:

Here's a quick memory collage I made after the event as a remote buildooor / participant.

Challenges

We fed NDI into an enterprise Zoom solution to get back-and-forth communication (we could clearly hear whoever had one of the wireless mics). Noise from the crowd was cancelled out thanks to the AI noise suppression.

Unfortunately, Arashi and I both had some bad luck with audio / network. Also, next time let's triple-check the stream audio, since YouTube's algorithms took the stream down for some reason. Remote AV can be finicky sometimes, but aside from that, things were smooth, judging by the zero complaints I heard about latency and quality.

Coordination was another challenge as a DAO producing a physical event on a tight schedule while working remotely. Here is what my PureRef board looked like while working on this and the fashion shoot over the past couple of months. It was helpful during community calls as a big-picture visual aid, and stuff can be copy-pasted into Figma for easy sharing.

M3 would love to explore future collaborations with LUME. Having a virtual venue to plan, previs / rehearse, simulcast, and archive with is a super powerful complement to physical event planning / coordination. Physical events can also be opportunities to create assets for virtual production shorts.


See more: LUME Studios

People hit me up asking how the event went, and said the behind-the-scenes posts were like watching a documentary, which was awesome. It's typically uncommon to share behind-the-scenes material before release, but we like to build in public: this very URL is proof of that :sunglasses: :+1:

The behind-the-scenes of virtual productions is a peek into the future of work with AR/VR technologies. We want to show that the open metaverse / decentralized Hollywood is being built right before everyone's eyes, to send a signal to more devs / artists / collaborators. The best way to do that, IMO, is by treating it like an open source project and not being afraid to share WIP along the way.

Also, all the content we've captured thus far can someday be properly edited into a future movie or event about the open metaverse movement.


A few of the folks I connected with during the event:

  • Tryspace.com (digital e-commerce)

  • Polygine team: building graphics tech to enable interoperable digital wearables, with a focus on automating the size scaling of wearables for different-sized avatars. They showed me a short demo scaling some Cryptovoxels shoes. They want a partner like MetaFactory to test real digital products as a proof of concept. (Telegram group for an upcoming call/demo.)

  • Neon Coats: female model collective for both physical and digital fashion / wearables. Lort knows we need more feminine energy in Web3 :)

  • Soze of course :slightly_smiling_face: He flew out with Jin, got stuck with me lmao

  • Uniqlo: upcoming call Thursday.

  • RED / NEON DAO: upcoming meeting on investment, etc.
    → Pitch deck refinement; an M3 x MF meeting is needed to iron out full-scope alignment regarding the raise, tokenomics, etc. → Demo of MF-OS hosted by MetaDreamer.


This is really clean. The TV setup is fire!
