The issue explored in SpatialFlow stems from the combination of two interconnected areas. Existing multi-dimensional data and projects carry complex information that can be difficult for users to understand and build upon. Examples include navigating through years of computer code, or revisiting a node-based or multi-layered programming project (such as a node-based material or shader program developed by a technical artist, or a three-dimensional architectural form constructed using algorithms).
Displaying complex data is itself a challenge: “some low-dimensionality projection of high-dimensional structures would smear out the structures that may be present in the data, and thus render them effectively unrecognizable” (Donalek et al. 609). A two-dimensional approach has been widely adopted for many data visualizations as well as for node-based programming, yet there is potential to improve on it, as humans are “biologically optimized to see the world and the patterns in it in 3 dimensions” (Donalek et al. 609). Some recent work has explored viewing existing data in three dimensions; however, there remains a gap in the creation and manipulation of data flows in three dimensions for improved comprehension. Current AR, VR, and mixed-reality software often does not make full use of the three-dimensional medium: many VR games, for example, directly take what works on a two-dimensional screen and display it through a head-mounted display instead. Remote work and collaboration present their own challenges in the current landscape, where many collaborators feel isolated and report barriers to communication and a disconnect between users. When data flowcharts and design projects become sufficiently complex, the information becomes difficult to understand and communicate, hindering both personal and collaborative work.
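The ‘smearing’ effect that Donalek et al. describe can be illustrated with a minimal sketch (the data and projection here are invented for illustration, not taken from their study): two clusters that are clearly separated in three dimensions become indistinguishable once a two-dimensional projection discards the dimension that separates them.

```python
# Sketch of the 'smearing' effect: two clusters that are well separated
# in 3D can become indistinguishable after a 2D projection drops the
# separating dimension. All data here is illustrative.

cluster_a = [(0.1 * i, 0.1 * i, 0.0) for i in range(5)]   # near z = 0
cluster_b = [(0.1 * i, 0.1 * i, 5.0) for i in range(5)]   # near z = 5

def project_xy(points):
    """Project 3D points onto the xy-plane, discarding z."""
    return [(x, y) for x, y, _ in points]

# In 3D, the clusters are 5 units apart along z...
min_gap_3d = min(abs(az - bz)
                 for *_, az in cluster_a
                 for *_, bz in cluster_b)

# ...but their xy-projections coincide exactly.
print(min_gap_3d)                                      # 5.0
print(project_xy(cluster_a) == project_xy(cluster_b))  # True
```

A three-dimensional display sidesteps this loss by letting the viewer keep (and move around) the separating dimension.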
Theoretical Background and Framework
SpatialFlow builds upon histories of human-computer interaction and user interface design, exploring possibilities for future interfaces following the speculative spatial evolution of computer inputs. Through a speculative design methodology, questions are generated from experiments in alternative interactions within spatial computing. In SpatialFlow, data flow is defined as information that moves through different states and is thereby transformed. Visual programming works in a similar way: it resembles building a flow chart, with each block, or ‘node’, performing one operation. It is currently widely used in software tools such as Unreal Engine, Unity, Max/MSP, Rhino Grasshopper, and TouchDesigner. In Rhino’s plugin Grasshopper, for example, “each of these nodes performs a specific operation given one or more inputs, and creates one or more outputs. The inputs are represented as semi-circular ‘ports’ on the left of the node, while the outputs are represented as ports on the right side” (Nagy 2017). This type of visual programming is referred to as dataflow visual programming.
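The node-and-port model Nagy describes can be sketched as a small program (the class and function names here are illustrative, not drawn from Grasshopper or any cited tool): each node applies one operation to the values supplied through its input ports and exposes the result as an output, which downstream nodes pull on demand.

```python
# Minimal sketch of a dataflow node graph, in the spirit of
# Grasshopper-style visual programming. All names are illustrative.

class Node:
    """A node performs one operation on its inputs and exposes an output."""
    def __init__(self, op, *inputs):
        self.op = op          # function applied to resolved input values
        self.inputs = inputs  # upstream nodes (the 'ports' on the left)

    def evaluate(self):
        # Pull values through the graph: resolve each upstream node first.
        return self.op(*(n.evaluate() for n in self.inputs))

class Const(Node):
    """A source node with no inputs, holding a fixed value."""
    def __init__(self, value):
        super().__init__(lambda: value)

# Wire up a tiny graph computing (3 + 4) * 2.
total = Node(lambda a, b: a * b,
             Node(lambda a, b: a + b, Const(3), Const(4)),
             Const(2))
print(total.evaluate())  # 14
```

Changing any source value and re-evaluating propagates the change through every connected node, which is the behaviour that makes dataflow graphs feel ‘live’ in tools like Grasshopper.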
Dataflow programming can be conceived of as a conveyor belt connecting a system of factory machines. By extension, SpatialFlow can be considered an exploration into augmenting the user experience of creative programming itself, stemming from a critique of current programming practices. This opens up room for better collaborative visualization in environmental design, where designs can be viewed and manipulated in 3D space. In SpatialFlow, users manipulate nodes in thin air by ‘grabbing’ differently coloured blocks and orienting them in space. Different aspects can be changed by pressing buttons, grabbing levers, and hitting switches. This approach makes the design process more skeuomorphic, tangible, and visible.
RiftSketch, created by Brian Peiris, is a virtual reality software application that allows a user to program visualizations inside a virtual reality environment. The program “presents you with a barren virtual world and a text editor. You type code into the text editor and the code creates and manipulates objects in the world” (Durbin 2016). The user can witness the results of their code appear in front of them, and can view and interact with their creations in a richer way. RiftSketch offers a compelling visual experience, modifying the way that the creator interacts with the creation by bringing both agents into the same virtual space. However, it does not attempt to fully augment the user experience of creation itself; there is much room to explore how the experience can be shifted away from the normative keyboard-and-text interface. In SpatialFlow, this has inspired the ability for users to move away from the keyboard, interacting with and visualizing the logic flow itself, and to move around and use the created structure for other purposes within the environment. Users can copy their generated structures or buildings, creating a cityscape from their objects.
Dynamicland is a spatial computing project and research organization located in Oakland, California, dedicated to understanding how user interfaces will evolve over thousands of years. Dynamicland transforms a building into a computer: the UI can be a wall or a table. Paper and sticky notes can be found everywhere, and the ceilings are covered in cameras and projectors. Each piece of paper carries its own program, written in the Lua programming language, and some printouts display their own code. One use involves paper representing datasets: moving a sheet onto a graph sheet brings its data into the graph, while another page uses a stick as a slider to pick the year of information visualized. Multiple users can interact with a program, and with its code directly, together in the same space. Dynamicland as a creative space reflects the vision of SpatialFlow, creating an environment that encourages collaboration, playful experimentation, and movement through the space. Its use of physical objects differs from SpatialFlow, where the user currently manipulates only virtual objects in the room. Dynamicland’s displays are also rather flat, as they are based on sheets of paper, while SpatialFlow’s visualizations are for the most part three-dimensional. It will be interesting to see how the physical and the virtual can best be combined for ease of use. Dynamicland has inspired the transition of SpatialFlow’s prototype into more of an open-office environment, where creators may feel more at home. This allows users to understand the context of the space they are creating in, making it more believable and relatable to real working environments.
iViz, developed by a team of researchers at the California Institute of Technology, is a general-purpose collaborative VR data-visualization tool for the scientific community. The project allows large-scale datasets to be rendered with many dimensions that are difficult to represent two-dimensionally. In the software, “the XYZ spatial coordinates (in some data parameter space, not the physical space), colors, sizes, transparencies and shapes of data points, their textures, orientations, rotation, pulsation, etc., [are used] to encode a maximum number of data dimensions” (Donalek et al. 611). Users are rendered as avatars in the same virtual space as the data. In addition to each user having their own viewpoint, one user can ‘broadcast’ their view to the others as they navigate through the data. Data points can also be linked to web pages, providing a richer experience by connecting data with its context. Insight gained from iViz inspires features in SpatialFlow, such as possibilities for users to present their work and to link between different contexts in the same interface. The overall theme of SpatialFlow is an augmented office in which environmental designers and architects can collaborate and interact with holographic inputs to switch between the contexts of parametric modelling, building design, and city planning.
Significance of SpatialFlow
SpatialFlow makes a contribution in Human-Computer Interaction (HCI), and by extension, User Experience Design (UX). HCI aims to innovate the dialogue between humans and machines through the combination of human factors engineering and cognitive science, among other fields. “If there is one device that symbolizes the emergence of HCI, it is the computer mouse…[it] was destined to fundamentally change the way humans interact with computers. Instead of typing commands, a user could manipulate a mouse to control an on-screen tracking symbol, or cursor” (MacKenzie 6). As an extension of the mouse as a tool for interacting with a computer, SpatialFlow experiments with the potential for a richer level of communication between users and computing devices through the paradigm of digital interfaces in spatial computing. Designers gain the ability to better visualize and understand their processes and 3D creations.
As a UX/UI designer, I am interested in exploring ways in which users can interact with technology in truly meaningful ways. With SpatialFlow, this involves using technology for collaborative ‘creative coding’ and environmental design practice. I find that there is much room to deepen the experience of design workflows, and to explore the augmentation of a space that allows for collaborative expression that does not fit on a two-dimensional screen. Positioning the project within spatial computing, using a speculative and iterative design methodology, allows me to shift focus from current industry practices and widely available technology towards the critical and theoretical. This allows me to conceptualize and prototype possibilities for future innovations in computer interfaces used for a variety of purposes, guiding my future work.
Works Cited
Auger, James Henry. “Living With Robots: A Speculative Design Approach.” Journal of Human-Robot Interaction, vol. 3, no. 1, 28 Feb. 2014, pp. 20–42, doi:10.5898/jhri.3.1.auger.
Ball, Linden J., and Thomas C. Ormerod. “Putting Ethnography to Work: The Case for a Cognitive Ethnography of Design.” International Journal of Human-Computer Studies, vol. 53, no. 1, 2000, pp. 147–168, doi:10.1006/ijhc.2000.0372.
Boshernitsan, Marat, and Michael Sean Downes. Visual Programming Languages: A Survey. Computer Science Division, University of California, 2004.
Donalek, Ciro, et al. “Immersive and Collaborative Data Visualization Using Virtual Reality Platforms.” 2014 IEEE International Conference on Big Data (Big Data), 2014, pp. 609–614, doi:10.1109/bigdata.2014.7004282.
Draskovic, Michael. “Can Spatial Computing Change How We View the Digital World?” Pacific Standard, 26 July 2017, psmag.com/news/can-spatial-computing-change-how-we-view-the-digital-world.
Durbin, Joe, et al. “Real-Time Coding In VR With ‘RiftSketch’ From Altspace Dev.” UploadVR, 12 Feb. 2016, uploadvr.com/riftsketch/.
Elliott, Anthony. “Using Virtual Reality to Create Software: A Likely Future.” Medium, 23 Jan. 2015, medium.com/@anthonyE_vr/using-virtual-reality-to-create-software-a-likely-future-9c4472108289.
Finch, Sarah. “At A Glance – Spatial Computing.” Disruption Hub, Disruption Hub, 3 May 2018, disruptionhub.com/spatial-computing/.
Gobo, Giampietro, and Lukas T. Marciniak. “Ethnography.” Qualitative Research, vol. 3, no. 1, 2011, pp. 15–36.
Grenny, Joseph, and David Maxfield. “A Study of 1,100 Employees Found That Remote Workers Feel Shunned and Left Out.” Harvard Business Review, 14 May 2018, hbr.org/2017/11/a-study-of-1100-employees-found-that-remote-workers-feel-shunned-and-left-out.
Handy, Alex. “Dynamicland Radically Rethinks the Computer Interface.” The New Stack, 19 Jan. 2018, thenewstack.io/dynamicland-rethinks-computer-interfaces/.
Harris, Sam. “The Benefits and Pitfalls of Pair Programming in the Workplace.” FreeCodeCamp.org, FreeCodeCamp.org, 22 Aug. 2017, www.freecodecamp.org/news/the-benefits-and-pitfalls-of-pair-programming-in-the-workplace-e68c3ed3c81f/.
MacKenzie, I. Scott. Human-Computer Interaction: an Empirical Research Perspective. Morgan Kaufmann, 2013.
Metcalfe, Tom. “Futuristic ‘Hologram’ Tech Promises Ultra-Realistic Human Telepresence.” NBCNews.com, NBCUniversal News Group, 7 May 2018, www.nbcnews.com/mach/science/futuristic-hologram-tech-promises-ultra-realistic-human-telepresence-ncna871526.
Nagy, Danil. “Computational Design in Grasshopper.” Medium, Generative Design, 6 Feb. 2017, medium.com/generative-design/computational-design-in-grasshopper-1a0b62963690.
“Remote Material Nodes.” Unreal Engine Forums, 3 Mar. 2015, forums.unrealengine.com/unreal-engine/feedback-for-epic/31850-remote-material-nodes.
Repenning, Alexander, et al. “Beyond Minecraft: Facilitating Computational Thinking through Modeling and Programming in 3D.” IEEE Computer Graphics and Applications, vol. 34, no. 3, 12 May 2014, pp. 68–71, doi:10.1109/mcg.2014.46.
Victor, Bret. “Research Agenda and Former Floor Plan.” Communications Design Group SF, 10 Mar. 2014.