
Implementing Augmented Reality in 150 Lines of JavaScript

Author: _扫地僧_ | Published 2019-01-10 10:34

    Augmented Reality (AR) is a technology that computes the position and orientation of the camera image in real time and overlays matching images, video, or 3D models onto it. Its goal is to superimpose the virtual world on the real world on screen and let the two interact. The concept was first proposed in 1990, and as the CPUs of portable devices keep getting faster, AR is expected to find ever wider use.

    This article walks through a Hello World example of augmented reality built with AR.js, an open-source JavaScript framework.

    Let's look at the result first:

    First, open this demo application, which I have deployed on GitHub Pages, in your phone's browser:

    https://i042416.github.io/FioriODataTestTool2014/WebContent/098_ar.html

    I used the Chrome browser on an Android phone.

    When the page opens, it asks whether the web application may access your phone's camera. Tap Allow.
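
    For context, the permission prompt is the browser's standard camera-access dialog, triggered by the getUserMedia API that webcam-based AR pages such as this one ultimately rely on. A minimal sketch of that flow on its own, independent of AR.js and with a made-up video element id, looks roughly like this:

    <video id="camera-preview" autoplay playsinline></video>
    <script>
        // request the rear-facing camera; this call is what triggers the permission
        // dialog, and it only succeeds on HTTPS pages or on localhost
        navigator.mediaDevices.getUserMedia({ video: { facingMode: 'environment' } })
            .then(function(stream){
                document.getElementById('camera-preview').srcObject = stream;
            })
            .catch(function(err){
                console.error('Camera access denied or unavailable:', err);
            });
    </script>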

    Then scan this image with your phone's camera (it is the Hiro marker, referenced as patt.hiro in the code below):

    Something magical happens: looking through the phone's camera, you will see a new, continuously rotating 3D object appear on the screen, as shown below.

    Now let's look at how this simplest of examples was developed.

    All of the source code is in my GitHub repository:

    https://github.com/i042416/FioriODataTestTool2014/tree/master/WebContent/ar

    Create a new HTML file, paste the following 150 lines of code into it, and run it on a web server (a minimal local-server sketch follows after the listing); then the steps described above will let you test the AR effect:

    <!DOCTYPE html>
    <meta name="viewport" content="width=device-width, user-scalable=no, minimum-scale=1.0, maximum-scale=1.0">
    
    <script src='ar/lib/three.min.js'></script>
    <script src="ar/lib/stats.min.js"></script>
    <script src="ar/lib/ar.js"></script>
    
    <script>
        // prefix used below when resolving the marker pattern path
        THREEx.ArToolkitContext.baseURL = '';
    </script>
    <body style='margin : 0px; overflow: hidden; font-family: Monospace;'>
    <div style='position: absolute; top: 10px; width:100%; text-align: center; z-index: 1;'></div>
    
    <script>
    
        // create a transparent WebGL renderer, layered on top of the webcam video feed
        var renderer    = new THREE.WebGLRenderer({
            // antialias    : true,
            alpha: true
        });
        renderer.setClearColor(new THREE.Color('lightgrey'), 0)
        // renderer.setPixelRatio( 1/2 );
        renderer.setSize( window.innerWidth, window.innerHeight );
        renderer.domElement.style.position = 'absolute'
        renderer.domElement.style.top = '0px'
        renderer.domElement.style.left = '0px'
        document.body.appendChild( renderer.domElement );
    
        // array of functions for the rendering loop
        var onRenderFcts= [];
    
        // init scene and camera
        var scene   = new THREE.Scene();
        
        var camera = new THREE.Camera();
        scene.add(camera);
        
        // ArToolkitSource provides the frames that marker detection will run on
        var arToolkitSource = new THREEx.ArToolkitSource({
            // to read from the webcam 
            sourceType : 'webcam',
    
            // to read from an image
            // sourceType : 'image',
            // sourceUrl : THREEx.ArToolkitContext.baseURL + '../data/images/img.jpg',      
    
            // to read from a video
            // sourceType : 'video',
            // sourceUrl : THREEx.ArToolkitContext.baseURL + '../data/videos/headtracking.mp4',     
        })
    
        arToolkitSource.init(function onReady(){
            onResize()
        })
        
        window.addEventListener('resize', function(){
            onResize()
        })
        // keep the webcam element and the AR detection canvas sized to the window
        function onResize(){
            arToolkitSource.onResize()  
            arToolkitSource.copySizeTo(renderer.domElement) 
            if( arToolkitContext.arController !== null ){
                arToolkitSource.copySizeTo(arToolkitContext.arController.canvas)    
            }   
        }
    
        // ArToolkitContext wraps the ARToolkit engine that performs marker detection on each frame
        var arToolkitContext = new THREEx.ArToolkitContext({
            // cameraParametersUrl: THREEx.ArToolkitContext.baseURL + '../data/data/camera_para.dat',
            cameraParametersUrl: 'ar/data/data/camera_para.dat',
            detectionMode: 'mono',
            maxDetectionRate: 30,
            canvasWidth: 80*3,
            canvasHeight: 60*3,
        })
    
        arToolkitContext.init(function onCompleted(){
            camera.projectionMatrix.copy( arToolkitContext.getProjectionMatrix() );
        })
    
        onRenderFcts.push(function(){
            if( arToolkitSource.ready === false )   
                return;
            arToolkitContext.update( arToolkitSource.domElement )
        })
        
        // markerRoot is the group whose pose will follow the detected marker
        var markerRoot = new THREE.Group()
        scene.add(markerRoot)
        // ArMarkerControls tracks the Hiro pattern marker and updates markerRoot's pose accordingly
        var artoolkitMarker = new THREEx.ArMarkerControls(arToolkitContext, markerRoot, {
            type : 'pattern',
            patternUrl : THREEx.ArToolkitContext.baseURL + 'ar/data/data/patt.hiro'
        })
    
        // build a smoothedControls
        var smoothedRoot = new THREE.Group()
        scene.add(smoothedRoot)
        var smoothedControls = new THREEx.ArSmoothedControls(smoothedRoot, {
            lerpPosition: 0.4,
            lerpQuaternion: 0.3,
            lerpScale: 1,
        })
        onRenderFcts.push(function(delta){
            smoothedControls.update(markerRoot)
        })
    
        var arWorldRoot = smoothedRoot
    
        // add a semi-transparent cube sitting on the marker
        var geometry    = new THREE.CubeGeometry(1,1,1);
        var material    = new THREE.MeshNormalMaterial({
            transparent : true,
            opacity: 0.5,
            side: THREE.DoubleSide
        }); 
        var mesh    = new THREE.Mesh( geometry, material );
        mesh.position.y = geometry.parameters.height/2
        arWorldRoot.add( mesh );
        
        // add a torus knot above the cube; this is the mesh rotated in the render loop below
        var geometry    = new THREE.TorusKnotGeometry(0.3,0.1,64,16);
        var material    = new THREE.MeshNormalMaterial(); 
        var mesh    = new THREE.Mesh( geometry, material );
        mesh.position.y = 0.5
        arWorldRoot.add( mesh );
        
        onRenderFcts.push(function(){
            mesh.rotation.x += 0.1
        })
    
        // stats.js frame-rate counter overlay
        var stats = new Stats();
        document.body.appendChild( stats.dom );
        // render the scene
        onRenderFcts.push(function(){
            renderer.render( scene, camera );
            stats.update();
        })
    
        // run the rendering loop
        var lastTimeMsec= null
        requestAnimationFrame(function animate(nowMsec){
            // keep looping
            requestAnimationFrame( animate );
            // measure time
            lastTimeMsec    = lastTimeMsec || nowMsec-1000/60
            var deltaMsec   = Math.min(200, nowMsec - lastTimeMsec)
            lastTimeMsec    = nowMsec
            // call each update function
            onRenderFcts.forEach(function(onRenderFct){
                onRenderFct(deltaMsec/1000, nowMsec/1000)
            })
        })
    </script></body>
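
    If you want to test locally before deploying, the page just needs to be served over HTTP; keep in mind that most mobile browsers only grant camera access to pages served over HTTPS or from localhost. Purely as a convenience, here is a minimal Node.js static-server sketch; the port, the file name 098_ar.html and the extension map are my own assumptions, and any static web server will do just as well:

    // serve.js - minimal static file server for local AR testing (a sketch, not part of the original demo)
    var http = require('http');
    var fs   = require('fs');
    var path = require('path');

    var types = {
        '.html': 'text/html',
        '.js':   'text/javascript',
        '.dat':  'application/octet-stream',
        '.patt': 'text/plain'
    };

    http.createServer(function(req, res){
        // map the request URL onto the current directory
        var urlPath  = decodeURIComponent(req.url.split('?')[0]);
        var filePath = path.join(__dirname, urlPath === '/' ? '098_ar.html' : urlPath);
        fs.readFile(filePath, function(err, data){
            if (err) { res.writeHead(404); res.end('Not found'); return; }
            res.writeHead(200, { 'Content-Type': types[path.extname(filePath)] || 'application/octet-stream' });
            res.end(data);
        });
    }).listen(8080, function(){
        console.log('Serving on http://localhost:8080');
    });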
    

    Of course, this effect comes from the open-source AR.js framework by jeromeetienne:

    https://github.com/jeromeetienne/AR.js/

    The author himself modestly points out that his open-source augmented-reality framework also stands on the shoulders of giants: for example, the 3D rendering relies on another open-source framework, three.js.
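
    To make the division of labour clearer: three.js on its own builds and renders the 3D scene, while AR.js contributes the camera source, the marker detection and the marker's pose. Stripped of all the AR plumbing, the rendering part of the demo boils down to roughly the following sketch (plain three.js with an ordinary perspective camera, no marker tracking):

    <script src='ar/lib/three.min.js'></script>
    <script>
        // plain three.js: renderer, perspective camera and a spinning torus knot
        var renderer = new THREE.WebGLRenderer({ antialias: true });
        renderer.setSize(window.innerWidth, window.innerHeight);
        document.body.appendChild(renderer.domElement);

        var scene  = new THREE.Scene();
        var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.1, 100);
        camera.position.z = 3;

        var mesh = new THREE.Mesh(
            new THREE.TorusKnotGeometry(0.3, 0.1, 64, 16),
            new THREE.MeshNormalMaterial()
        );
        scene.add(mesh);

        requestAnimationFrame(function animate(){
            requestAnimationFrame(animate);
            mesh.rotation.x += 0.1;              // same rotation step as in the AR demo
            renderer.render(scene, camera);
        });
    </script>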

    For more of Jerry's original articles, follow the WeChat public account "汪子熙".
