
The metaverse boom: how does Amazon get its VR ticket?

Author | Xu Xinyu


Introduction to Fidelity Labs

The metaverse concept is hot right now: commercial giants of every kind are scrambling for the market, and a wave of metaverse products has emerged. Immersion and decentralization are the two core elements of the metaverse, and it is foreseeable that in the near future all sorts of VR and AR applications will blossom across it. AWS offers one way in: we can build VR/AR scenes with Amazon Sumerian.


Fidelity Labs, the fintech incubator owned by Fidelity Investments, has an incubation project that uses VR to bring a hosted, data-visualization experience to users in the financial industry.

It uses an avatar plus text-to-speech to broadcast financial trends, and pre-recorded gestures add a stronger sense of presence and emotional color. Through VR, the data becomes three-dimensional and virtualized.

The polished result combines 3D visualization, WebGL, 3D modeling, virtual-human technology, NLP (natural language processing), VR, text-to-speech, and more.

Does that mean developing such a project is out of reach? Of course not: we will build a simplified version of the Fidelity Labs demo with Amazon Sumerian.


Create scenes using Amazon Sumerian

As mentioned above, with Amazon Amplify plus Amazon Sumerian we can quickly build web/app VR applications like this one. So what is Amazon Sumerian?

Amazon Sumerian uses the WebGL and WebXR standards to create immersive experiences that run directly in a web browser, are accessible via a simple URL in seconds, and execute on the major AR/VR hardware platforms. Build a scene once and deploy it anywhere.

In terms of features it is no match for professional tools like 3ds Max, UE4, and Unity3D, but its learning curve is far gentler: the lightweight core modules let front-end engineers with only basic 3D knowledge get started quickly.

Sumerian also ships an asset library of simple shapes, 3D models, Hosts, textures, and scripts; asset packs for virtual reality (VR); and templates, assets, and sample projects for building augmented reality (AR) applications on iOS and Android, all of which speed up development.

On top of that we can extend into 3D web, augmented reality, virtual reality, training simulations, manufacturing scenarios, and more. Most importantly, because it is web-based, scalability and cross-device adaptability are excellent.

Let's walk through a simple example: a lite VR application built from a scene that uses the Sumerian asset library, state-machine animations, and virtual reality (VR) support. The scene is then imported into an application created with Amazon Amplify to close the loop.

Create a project

Since Amazon Sumerian doesn't require 3D graphics production or programming experience to build rich interactive VR and AR scenes, let's quickly create a project from the Sumerian dashboard.


In the Sumerian dashboard, it's easy to create a new scene at the push of a button.


The new scene opens in the default view of the Sumerian editor. Once the scene you just created is open in the editor, you can easily import assets into it.


Import Host

The next step is to import a Host from the Sumerian library. A Host is a character model provided by Sumerian with built-in animations and voice support.

1. Select Import Assets

2. Select Cristine, and then select Add


When the asset package finishes loading, drag the Host entity from the Asset panel to the canvas and configure the behavior.


We can add behavior to the scene by attaching scripts or state machine components to entities.

With state machines, you can add behavior visually by selecting the actions triggered by an event and organizing them into states. Add a state machine to the Cristine entity to animate it and its children.

For example, if we want to control the model changes of the character, we can use the state machine to animate Cristine.

1. Select Cristine in the entity panel

2. In the Inspector panel, select Add Component, then select State Machine

3. Click the plus sign next to the Behavior field to create a new behavior. Name it animate

The State Machine panel appears, showing a single state. Name the state up, and then select Add Action

4. Under Animation, select Tween Rotate, and apply the following properties:

  • Rotation Z: -24
  • Relative: disabled
  • Time: 1000
  • Easing type: Linear
  • Direction: In

5. Select Add Action, and then add a Wait action.

  • Time: 2000
  • Random: 0

6. In the State Machine panel, duplicate the up state. Double-click the new state to open it in the Inspector panel

7. Change its name to down, and then change the rotation value from -24 to 0

8. In the State Machine panel, each state shows two events: one fires when the animation ends, the other when the wait action ends. Click the wait-end event on the up state and drag a line to the edge of the down state, then do the same in the opposite direction


With that, a simple animation is done.

Configure voice for Host

Next we add a voice to the character. Select the Host in the canvas or in the entity panel:

1. Under Speech, drop some text files onto the speech field

2. Select a voice for the Host

3. Add a script component to Host

4. Click the plus sign next to the script field and select Custom

5. Click Edit (the pencil icon) on the script instance parameters to open the script in a text editor. You can also press the J key to do so

6. You can open the text editor at any time

7. Double-click the script name in the documents list and change it to RandomSpeech

8. Replace the placeholder setup function with the following code:

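A minimal sketch of that setup function, based on Sumerian's standard scripting pattern as described below (the aws.sdkReady event name and the speechComponent lookup follow Sumerian's documented API, but treat the exact code as an approximation):

function setup(args, ctx) {
  // Wait for the AWS SDK to load and retrieve credentials
  sumerian.SystemBus.addListener('aws.sdkReady', () => {
    // Get a reference to the Speech component on this entity (the Host)
    const speechComponent = ctx.entity.getComponent('speechComponent');
    const speeches = speechComponent.speeches;
    // Pick one of the attached speeches at random and play it
    const speech = speeches[Math.floor(Math.random() * speeches.length)];
    speech.play();
  }, true);
}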

This script waits for the AWS SDK to load and retrieve credentials. It then gets a reference to the Speech component on the same entity (the Host), retrieves the list of attached speeches, picks one from the array, and plays it.

Go back to the Speech component and click the gesture icon next to each speech file to add gesture marks. That way, the built-in gesture animations make the character gesture while speaking.

Add VR mode

So far we have only used the standard camera in playback mode. Next we add virtual reality (VR) support so that users can view the scene in 3D with a VR headset and head tracking.

1. Select Import Assets, and then add the CoreVR Asset Package to your scene

2. When the asset pack finishes loading, drag the VRCameraRig entity onto the canvas to add it to your scene

3. Select the VRCameraRig entity

4. Select the VRCameraRig component

5. Select the Current VR camera rig option to enable the rig

6. Uncheck the Start from current camera option. That way, when the user enters VR mode, the rig uses the position where you placed it in the scene rather than the position of the non-VR camera

7. In the entity panel, drag the VRCameraRig entity onto the Cristine entity to make it a sibling of the user camera

8. Select the user camera. Click the gear icon in the Transform section and choose Copy to copy the transform values

9. Select the VRCameraRig. Click the gear icon in the Transform section and select Paste to paste the transform values from the user camera

10. Use the green transform handle to adjust the height of the VR camera relative to Cristine

After debugging, our scene is complete.

Publish the scene

Click the Publish drop-down menu in the upper-right corner of the Sumerian console and click Private Hosting.

If your scene has already been publicly published, you will need to unpublish and then publish again using the instructions below.


A confirmation dialog then appears; click the Publish button.


Now click the Download JSON Configuration button to download the scene configuration JSON, which will be used to configure your scene in Amazon Amplify.


Create the app with Amazon Amplify and import the configuration

We have finished building the scene, but we still lack a carrier application to display it. Next we'll build the app with Amazon Amplify.

Amazon Amplify is a set of purpose-built tools and services that let front-end web and mobile developers quickly and easily build full-stack applications on AWS, with the flexibility to customize them further using the full range of AWS services. Amplify supports JavaScript, React, Angular, Vue, and Next.js for web apps, as well as Android, iOS, React Native, Ionic, and Flutter for mobile apps, making it quick and easy to turn an idea into an app.

Let's launch an app in 5 minutes via Amazon Amplify in a few simple steps:

1. Check the environment

In a console window, verify that you are running Node.js version 12.x or later and npm version 6.x or later. Here I'm on Node 14.15.4 and npm 6.14.10.
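You can check both versions from the command line:

node -v

npm -v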


2. Install the AWS Amplify CLI scaffolding on your machine (similar to vue-cli):

npm install -g @aws-amplify/cli

3. Configure Amplify

amplify configure

Open a browser and log in to the AWS console to create an IAM user

4. Create a react front-end application

npx create-react-app amplify-vr-app

cd amplify-vr-app

5. Initialize the project, stepping through the prompts and accepting the defaults as needed:

amplify init


6. Install the Amplify React library @aws-amplify/ui-react (which includes the React UI components):

npm install aws-amplify @aws-amplify/ui-react

7. Create the front end for the application. Open the src/App.js file and replace its entire contents with the following code:

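A minimal sketch of such an App.js, following the aws-amplify XR pattern (the scene name scene1, the region, and the sumerian_exports_scene1 filename are assumptions):

import React from 'react';
import Amplify from 'aws-amplify';
import { SumerianScene } from 'aws-amplify-react';
import awsconfig from './aws-exports';
// Scene configuration JSON downloaded from the Sumerian console (assumed filename)
import sceneConfig from './sumerian_exports_scene1';

Amplify.configure({
  ...awsconfig,
  XR: {
    region: 'us-east-1', // assumption: the region your scene is published in
    scenes: {
      scene1: { sceneConfig }, // "scene1" is the friendly scene name
    },
  },
});

function App() {
  // Render the published Sumerian scene
  return <SumerianScene sceneName="scene1" />;
}

export default App;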

Start the service locally to see the effect:

npm start


Fast, isn't it? On top of this, AWS offers Amazon Amplify Studio, which lets developers quickly build web applications on AWS with minimal coding. And AWS doesn't just create applications fast; it provides a full-stack chain across the whole process, balancing quality against the R&D schedule.

Set the configuration

Once the app is running, we want to load our scene into it. First add the XR module with amplify add xr; the CLI will prompt for the XR category's configuration options, such as the scene name identifier and the Sumerian scene JSON configuration file.
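Run it from the project root:

amplify add xr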

The XR category sets up and leverages the Auth category to authorize access to the scene.

Make a note of the scene name you provided in the CLI prompt. When using the XR category in the framework, this value is used as an identifier for the scene.

After configuring the XR options, run amplify push to update the backend.

The XR resource shown during amplify push represents the IAM policy created for the scene; it does not change the scene shown in the Sumerian console. A configuration file named aws-exports.js is copied into the source directory you configured (for example ./src) and contains a list of all the scenes you configured.

Import an existing XR resource

Add the following code to your application to configure the XR category:

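A sketch of that configuration, following the aws-amplify XR category pattern (the region and the sumerian_exports_scene1 filename are assumptions):

import Amplify from 'aws-amplify';
import awsconfig from './aws-exports';
// Scene JSON downloaded from the Sumerian console (assumed filename)
import sceneConfig from './sumerian_exports_scene1';

Amplify.configure({
  ...awsconfig,
  XR: {
    region: 'us-east-1', // assumption: region where the scene was published
    scenes: {
      scene1: {        // friendly scene name noted at the CLI prompt
        sceneConfig,   // the imported scene configuration
      },
    },
  },
});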

You can add optional publishing parameters to the scene configuration:

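For instance, something like the following, where the alpha flag renders the scene with a transparent background (awsconfig and sceneConfig are imported as above; treat the exact shape as a sketch):

Amplify.configure({
  ...awsconfig,
  XR: {
    region: 'us-east-1',
    scenes: {
      scene1: {
        sceneConfig,
        publishParamOverrides: {
          alpha: true, // optional: transparent scene background
        },
      },
    },
  },
});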

To configure the front-end:

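A minimal sketch, assuming the generated aws-exports.js was copied to ./src:

import Amplify from 'aws-amplify';
import awsconfig from './aws-exports'; // generated by amplify push

Amplify.configure(awsconfig);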

For more information about using the XR category with the Amplify CLI in an Amplify project, run the amplify xr help command.

Scene usage

The XR category provides a loadScene method that renders a Sumerian scene into an HTML div element. Once the scene has loaded, the XR.start method starts it. To render the scene, pass the scene name and the element's id in the method call:

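A sketch of that call pattern (the element id sumerian-scene-dom-id mirrors Amplify's examples; scene1 is whatever scene name you configured):

import { XR } from 'aws-amplify';

// Assumes the page contains: <div id="sumerian-scene-dom-id"></div>
async function loadAndStartScene() {
  await XR.loadScene('scene1', 'sumerian-scene-dom-id');
  XR.start('scene1');
}

loadAndStartScene();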

In addition, you can use the SumerianScene UI component for an out-of-the-box UI.

A full-stack chain from development to delivery to management, plus peripheral services

We have now implemented a simple Fidelity Labs demo. What if we want to keep improving the entire development pipeline, for example by fleshing out the web side or the backend? Amazon Amplify certainly supports that.

An important part of AWS Amplify's complete ecosystem is its development-delivery-management full-stack chain.

These include, but are not limited to, the Amplify command-line interface (CLI), Amplify UI Components, the Amplify libraries, and the Amazon Amplify console.

We can work according to personal preference and habit. In a real project, we create our backend with Amazon Amplify Studio's simple point-and-click visual interface or with the Amplify command-line interface (CLI), and Amazon Amplify automatically provisions AWS services (e.g. Amazon Cognito for authentication, Amazon DynamoDB for the database, Amazon S3 for storage, and so on). Once these services are available, we can create the web application using Amazon Amplify Studio.

For the UI and component layers, AWS provides UI libraries comparable to the industry's antd, element, and vant. Better yet, AWS also ships the Amplify libraries, which make it easy to invoke service capabilities, and AWS even offers low-code solutions.

This means that we can build UIs using a library of pre-built UI components, integrate data or functionality from AWS services into the UI, and collaborate with UX designers through integration with Figma, a popular tool for designing and prototyping UIs—all without writing any code. Once the UI is complete, Amazon Amplify Studio automatically converts it into JavaScript or TypeScript code, so we have the flexibility to fully customize the design or behavior of our application to provide the best end-user experience while improving the efficiency of R&D.

Beyond quickly producing 3D VR applications as above, AWS has also incubated the Amazon Lumberyard open 3D engine, which is better suited to more specialized engineers doing differentiated development.

This means that we can implement a demo, but not just one.

Beyond the XR capability set, there are other services around the Amazon Sumerian ecosystem, such as AWS AI, AWS ML (machine learning), geo services, bots, and more.

This means that we can quickly build an immersive application through a plug-and-play combination of AWS capabilities.

In the post-pandemic era, working and attending events online have become the norm. With Amazon Location Service we can create an immersive attendance app that gives audiences an immersive experience online, recreating the grandeur of an offline forum in the cloud as a digital twin.

By integrating Amplify Geo with Amazon Sumerian, we can build 3D navigation, such as a wayfinding app for a large exhibition hall that offers maps plus 3D real-world scenes. A lecture could even be delivered by an AI + ML bot, and with NLP (natural language processing) the emotional feedback in attendees' conversations can be processed and aggregated to measure satisfaction with the conference.

Attendees can also be given accurate information about routes, locations, distances, and estimated travel times. In this way, differentiated services can be offered whether attendance is online or offline.

