Sensory Properties in Fusion of Visual/Haptic Stimuli Using Mixed Reality 577

Results

The estimated haptic and visual discrimination thresholds are shown in Tables 1 and 2. Visual discrimination thresholds are defined at 0.4-mm intervals, but haptic discrimination thresholds are not. In order to investigate how visual and haptic stimuli interact with each other, common criteria are necessary. We define the common criteria as the minimum perceivable seven steps for both visual and haptic discrimination thresholds: 0.2, 0.6, 1.0, 1.4, 1.8, 2.2, and 2.6 mm. Although humans can discriminate differences of less than 0.1 mm, the limited accuracy of the processing machinery makes it impossible to estimate haptic discrimination thresholds below 0.1 mm. By considering the different thresholds of visual and haptic information, we quantize the scale of the curvature radii into these seven steps: 0.2, 0.6, 1.0, 1.4, 1.8, 2.2, and 2.6 mm.

Standard Stimulus (mm)   Discrimination Threshold (mm)   Standard Deviation
0.2                      0.332                           0.057
0.6                      0.382                           0.078
1.0                      0.395                           0.062
1.4                      0.322                           0.052
1.8                      0.338                           0.045
Table 1. Visual Discrimination Threshold

4.3 Procedure of Subjective Evaluation

First, as a reference for the matching process, subjects were presented a standard stimulus in three ways: haptic only, vision only, and both. When subjects are instructed to observe the object using only haptics, they close their eyes, and the experimenter leads their hand onto the object. When subjects are instructed to observe the object using only vision, they watch the object while keeping their hands on the experimental table. When subjects are instructed to observe the object using both haptics and vision, they can watch and touch the object without any physical constraints. After observing a standard stimulus, subjects are required to determine a corresponding stimulus using only haptics, only vision, and both haptics and vision together.
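To make the seven-step common scale concrete, here is a minimal sketch (helper name is ours, not the chapter's) that snaps an arbitrary curvature radius onto the nearest of the seven common steps:

```python
# Quantize a curvature radius (mm) onto the seven-step common scale
# used for both visual and haptic stimuli: 0.2, 0.6, ..., 2.6 mm.
COMMON_STEPS_MM = [0.2, 0.6, 1.0, 1.4, 1.8, 2.2, 2.6]

def quantize_radius(radius_mm: float) -> float:
    """Return the common-scale step closest to the given radius."""
    return min(COMMON_STEPS_MM, key=lambda step: abs(step - radius_mm))

print(quantize_radius(0.35))  # -> 0.2
print(quantize_radius(1.25))  # -> 1.4
```

Any radius outside the scale is clamped to the nearest end step (e.g., 2.9 mm maps to 2.6 mm), which mirrors how the study restricts stimuli to the common criteria.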
In all trials, subjects are permitted to take as much time as needed. The stimuli displayed for matching are randomly ordered to control for order effects. When subjects request the next match-up stimulus, it is displayed after a 15-second interval.

Standard Stimulus (mm)   Discrimination Threshold (mm)   Standard Deviation
0.0                      0.000                           0.000
0.1                      0.000                           0.000
0.2                      0.130                           0.053
0.4                      0.270                           0.078
0.7                      0.238                           0.049
1.0                      0.237                           0.069
1.3                      0.300                           0.063
1.6                      0.418                           0.095
Table 2. Haptic Discrimination Threshold

There are nine combinations of the three ways of displaying the standard stimulus and the three ways of displaying the match-up stimuli. By applying multiple classification analysis to the results of all combinations, we investigate whether human perception is affected by fusing visual and haptic cues. Conducting the experiment with all seven perceivable steps for both visual and haptic discrimination thresholds would require an enormous amount of time and labor. Moreover, it is difficult to extract significant evidence that human perception is affected by fusing visual and haptic cues when one stimulus is too weak to affect the other. Thus, we conducted a preliminary experiment to choose combinations of visual and haptic stimuli that readily reveal the influence caused by the fusion. As a result, the combination {visual: 2.2 mm / haptic: 1.4 mm} was selected.

4.4 Result and Discussion

Results are illustrated in Figures 16 and 17. The horizontal axis represents the type of matching procedure, and the vertical axis represents the mean evaluated value of the radius of the edge. The line with rhombus nodes is the mean matching response when standard stimuli are presented by haptics only, the line with triangle nodes by vision only, and the line with box nodes by haptics and vision together.
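The nine standard/match-up presentation combinations described above can be enumerated directly; a small illustrative sketch (mode labels are ours):

```python
from itertools import product

# The three ways of presenting a stimulus: haptic only, vision only, or both.
MODES = ("haptic", "vision", "haptic+vision")

# Nine combinations: presentation mode for the standard stimulus
# crossed with presentation mode for the match-up stimulus.
combinations = list(product(MODES, repeat=2))
for standard_mode, matchup_mode in combinations:
    print(f"standard: {standard_mode:13s}  match-up: {matchup_mode}")
print(len(combinations))  # -> 9
```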
In the first evaluation, subjects were given a 1.4 mm curvature radius as the haptic stimulus and a 2.2 mm curvature radius as the visual stimulus. The result is shown in Figure 16. When subjects received the standard stimulus as a 1.4 mm haptic curvature radius and determined a corresponding stimulus using only haptics (the left rhombus node), they sensed it as 1.40±0.0 mm. On the other hand, when subjects received the standard stimulus as a 1.4 mm haptic curvature radius together with a 2.2 mm visual one and determined a corresponding stimulus using only haptics (the left box node), they sensed it as 1.64±0.2 mm, perceiving the edge as blunter than in the previous result. This shift was induced by presenting the 2.2 mm visual stimulus as part of the standard stimulus. When subjects received the standard stimulus as a 2.2 mm visual curvature radius and determined a corresponding stimulus using only vision (the right triangle node), they sensed it as 2.20±0.0 mm. On the other hand, when subjects received the standard stimulus as a 1.4 mm haptic curvature radius together with a 2.2 mm visual one and determined a corresponding stimulus using only vision (the right box node), they sensed it as 2.12±0.4 mm, perceiving the edge as sharper than in the previous result. This shift was induced by presenting the 1.4 mm haptic stimulus as part of the standard stimulus. When subjects received the standard stimulus as a 1.4 mm haptic curvature radius together with a 2.2 mm visual one and determined a corresponding stimulus using both haptics and vision (the middle box node), they sensed it as 1.84±0.1 mm. This experiment shows that the haptic stimulus seems to be affected by the visual stimulus when a discrepancy exists between the visual and haptic stimuli. Analysis of variance on our evaluation data showed a significant effect of the way the standard stimulus was presented (F(2,18) = 26.694, p < 0.05).

Advances in Haptics 578

Fig. 16.
Mean curvature radii selected as matches for haptic, visual, and haptic/visual standards; subjects touched an object with a 1.4 mm haptic curvature radius and saw a 2.2 mm visual one.

In the second evaluation, we switched the visual and haptic values to control for the order effect: subjects were given a 2.2 mm curvature radius as the haptic stimulus and a 1.4 mm curvature radius as the visual stimulus. The result is shown in Figure 17. When subjects received the standard stimulus as a 2.2 mm haptic curvature radius and determined a corresponding stimulus using only haptics (the left rhombus node), they sensed it as 2.20±0.0 mm. On the other hand, when subjects received the standard stimulus as a 2.2 mm haptic curvature radius together with a 1.4 mm visual one and determined a corresponding stimulus using only haptics (the left box node), they sensed it as 2.16±0.2 mm, perceiving the edge as sharper than in the previous result. This shift was induced by presenting the 1.4 mm visual stimulus as part of the standard stimulus. When subjects received the standard stimulus as a 2.2 mm haptic curvature radius together with a 1.4 mm visual one and determined a corresponding stimulus using both haptics and vision (the middle box node), they sensed it as 2.04±0.2 mm. This experiment shows that the haptic stimulus seems to be affected by the visual stimulus when a discrepancy exists between the visual and haptic stimuli. Analysis of variance on our evaluation data showed a significant effect of the way the standard stimulus was presented (F(2,18) = 36.394, p < 0.05).

Fig. 17. Mean curvature radii selected as matches for haptic, visual, and haptic/visual standards; subjects touched an object with a 2.2 mm haptic curvature radius and saw a 1.4 mm visual one.

These subjective evaluations of the sharpness of a cube's edge show that the perceived sharpness of an edge can be controlled by presenting a duller or sharper CG edge.
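The reported statistics F(2,18) are characteristic of a one-way analysis of variance across the three presentation conditions (three groups, 21 observations in total). As a sanity check on the degrees of freedom, here is a pure-Python sketch on synthetic, illustrative data — not the study's raw responses:

```python
def one_way_anova_F(groups):
    """Compute the one-way ANOVA F statistic and its degrees of freedom."""
    k = len(groups)                          # number of conditions
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(x for g in groups for x in g) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Synthetic responses (7 per condition) for the three ways of presenting
# the standard stimulus, loosely centered on the reported means.
haptic_only = [1.40, 1.40, 1.38, 1.42, 1.40, 1.41, 1.39]
both        = [1.84, 1.80, 1.90, 1.82, 1.86, 1.85, 1.81]
vision_only = [2.20, 2.18, 2.22, 2.20, 2.19, 2.21, 2.20]

F, df_b, df_w = one_way_anova_F([haptic_only, both, vision_only])
print(df_b, df_w)  # -> 2 18
```

Three conditions with seven responses each give exactly the (2, 18) degrees of freedom reported in the chapter.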
We calculated the occupancy rates of the haptic and visual stimuli for the evaluations using the method introduced in Lederman's paper (Lederman & Abbott, 1981). The haptic and visual influences are calculated by the following equations:

Haptic influence = (Mean(Vision standard) − Mean(Touch+Vision standard)) / (Mean(Vision standard) − Mean(Touch standard))   (1)

Visual influence = (Mean(Touch+Vision standard) − Mean(Touch standard)) / (Mean(Vision standard) − Mean(Touch standard))   (2)

In these equations, Mean(Touch+Vision standard) is the mean evaluated radius of the edge over all subject evaluations in which the standard haptic and visual stimuli were presented together. Mean(Vision standard) is the mean evaluated radius over all evaluations in which a standard visual stimulus was presented. Mean(Touch standard) is the mean evaluated radius over all evaluations in which a standard haptic stimulus was presented. In the first evaluation, the occupancy rate of the visual stimulus is 57.1% and that of the haptic stimulus is 42.9%. In the second evaluation, the occupancy rate of the visual stimulus is 77.8% and that of the haptic stimulus is 22.2%. These results show that as the curvature radius becomes larger, the haptic sensation becomes duller; as a result, the occupancy rate of the visual stimulus increases.
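Equations (1) and (2) can be checked numerically against the condition means reported for the first evaluation (touch 1.40 mm, vision 2.20 mm, both 1.84 mm). Note the sketch below yields 45%/55% rather than the published 42.9%/57.1%; the small gap presumably comes from averaging influences per subject rather than from condition means:

```python
def influences(mean_both, mean_vision, mean_touch):
    """Haptic and visual influence weights per the discrepancy paradigm
    (Eqs. 1 and 2); by construction the two weights sum to 1."""
    denom = mean_vision - mean_touch
    haptic = (mean_vision - mean_both) / denom
    visual = (mean_both - mean_touch) / denom
    return haptic, visual

haptic, visual = influences(mean_both=1.84, mean_vision=2.20, mean_touch=1.40)
print(f"haptic {haptic:.1%}, visual {visual:.1%}")  # -> haptic 45.0%, visual 55.0%
```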
6. Conclusion

This chapter introduced a system that can present visual/haptic sensory fusion using mixed reality. We investigated whether visual cues affect haptic cues. As a procedure for analyzing sensory properties, we focused on two features of objects. One is the impression of texture, which is intimately involved in the impression of products. The other is the sharpness of an edge, which is strongly affected by both the visual and haptic senses.
From the results of the subjective evaluation on the impression of visual/haptic texture, we can derive an interesting assumption: if we have learned from past experience that a material may have different haptic impressions (e.g., smooth and rough), we can control the haptic impression of a real object made of that material by changing the visual texture overlaid on the object. Preliminary results of the subjective evaluations on the sharpness of an edge show that users perceive an edge as duller or sharper than the real one when presented with an overlaid CG edge with a duller or sharper curvature.

7. References

Adams, W. J.; Banks, M. S. & van Ee, R. (2001). Adaptation to 3D distortions in human vision, Nature Neuroscience, Vol. 4, (1063-1064)
Biocca, F.; Kim, J. & Choi, Y. (2001). Visual Touch in Virtual Environments: An Exploratory Study of Presence, Multimodal Interfaces, and Cross-Modal Sensory Illusions, Presence, MIT Press, Vol. 10, No. 3, (247-265), June
Fiorentino, M.; de Amicis, R.; Monno, G. & Stork, A. (2002). Spacedesign: a Mixed Reality Workspace for Aesthetic Industrial Design, Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR02), (86-95)
Friedrich, W. (2002). ARVIKA - Augmented Reality for Development, Production and Service, Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR02), (3-4)
Hillis, J. M.; Ernst, M. O.; Banks, M. S. & Landy, M. S. (2002). Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses, Science, Vol. 298, (1627-1630)
Itoh, M.; Ozeki, M.; Nakamura, Y. & Ohta, Y. (2003). Simple and Robust Tracking of Hands and Objects for Video Indexing, Proceedings of the IEEE Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), (252-257)
Kato, H. & Billinghurst, M. (1999). Marker tracking and HMD calibration for a video-based augmented reality conferencing system, Proceedings
of the International Workshop on Augmented Reality (IWAR99), ACM, (85-94)
Lederman, S. J. & Abbott, S. G. (1981). Texture Perception: Studies of Intersensory Organization Using a Discrepancy Paradigm, and Visual Versus Tactual Psychophysics, Journal of Experimental Psychology: Human Perception and Performance, Vol. 7, No. 4, (902-915)
Lee, W. & Park, J. (2005). Augmented Foam: a Tangible Augmented Reality for Product Design, Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR05), (106-109)
Nakahara, M.; Kitahara, I. & Ohta, Y. (2007). Sensory Property in Fusion of Visual/Haptic Cues by Using Mixed Reality, Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (World Haptics 2007), (565-566)
Navab, N. (2003). Industrial Augmented Reality (IAR): Challenges in Design and Commercialization of Killer Apps, Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR03), (2-6)
Nolle, S. & Klinker, G. (2006). Augmented Reality as a Comparison Tool in Automotive Industry, Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR06), (249-250)
Ohta, Y. & Tamura, H. (1999). Mixed Reality: Merging Real and Virtual Worlds, Ohmsha, Ltd.
Rock, I. & Harris, C. S. (1967). Vision and touch, Scientific American, Vol. 216, (96-104), May
Rock, I. & Victor, J. (1964). Vision and touch: An experimentally created conflict between the two senses, Science, Vol. 143, (594-596)
Sandor, C.; Uchiyama, S. & Yamamoto, H. (2007). Visuo-Haptic Systems: Half-Mirrors Considered Harmful, Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (World Haptics 2007), (292-297)
Wang, Y. & MacKenzie, C. L. (2000). The Role of Contextual Haptic and Visual Constraints on Object Manipulation in Virtual Environments, Proceedings
of the SIGCHI Conference on Human Factors in Computing Systems, (532-539)
Wiedenmaier, S.; Oehme, O.; Schmidt, L. & Luczak, H. (2001). Augmented Reality (AR) for Assembly Processes - an Experimental Evaluation, Proceedings of the IEEE and ACM International Symposium on Augmented Reality (ISAR 2001), (185-186)
Expanding the Scope of Instant Messaging with Bidirectional Haptic Communication

Youngjae Kim and Minsoo Hahn
Korea Advanced Institute of Science and Technology, Republic of Korea

1. Introduction

For the past five years, haptic interfaces have been applied to various commercial products. Most consumers are now familiar with the term haptic. Many of them use touchscreen devices equipped with vibro-tactile feedback, although they may not have a clear understanding of what it is. According to Google Trends (http://www.google.com/trends/), Korean people type in and search for the keyword haptic more frequently than people in other countries.
The traffic gaps between Korea and other countries are as follows.

Region        Traffic  Std error   City                     Traffic  Std error
South Korea   1        0%          Seoul (South Korea)      1        0%
Vietnam       0.475    5%          Singapore (Singapore)    0.435    5%
Singapore     0.395    5%          Jakarta (Indonesia)      0.22     10%
Malaysia      0.25     5%          Ottawa (Canada)          0.21     10%
Philippines   0.23     5%          Bangkok (Thailand)       0.2      10%
Thailand      0.195    5%          Hong Kong (Hong Kong)    0.175    10%
Indonesia     0.18     10%         Delhi (India)            0.115    10%
Hong Kong     0.18     10%         Seattle (USA)            0.115    10%
Taiwan        0.145    5%          San Francisco (USA)      0.115    10%
India         0.14     5%          Los Angeles (USA)        0.11     10%
Table 1. Google Trends result for the keyword haptic (data acquired on Aug. 31, 2009)

In Table 1, the numbers in the Traffic columns are relative values normalized to the most dominant region (in this case, South Korea). As can be seen, the search traffic of South Korea is more than twice that of other countries such as Vietnam, Singapore, and the USA. This is mainly due to the marketing strategy of local cellular phone manufacturers, which included the term haptic in their product names. The important point is not only that people are becoming familiar with the keyword, but also that many research and industry fields are starting to focus on haptics and its effects. For example, a car manufacturer may try to apply a haptic interface to a navigation controller, or a bank may introduce ATMs with haptic-feedback-equipped touchscreens. In short, haptic technology is gradually changing our daily lifestyle.

The initial goal of haptic technology is to facilitate the manipulation of devices. A vibro-tactile feedback enables a user to control a device more accurately and easily. As a next step, haptics aims to make the control of target devices intuitive. This is mainly because, from a cognitive point of view, users expect some kind of reaction when they issue a command to the target.
Haptic technologies are widely employed in many areas these days, but in this chapter we focus on their communication usage only. As shown in many studies, haptics can be part of daily messaging behaviour. Computer-mediated messaging technologies continue to evolve rapidly, and various types of messaging services are being marketed, including short message services (SMS) on cellular phones, message-oriented networking services such as Twitter (Java et al. 2007), blogs with trackback and reply systems, and instant messenger applications that enable peer-to-peer communication in real time. More innovative types of messaging will continue to emerge (Poupyrev, Nashida, and Okabe 2007). Regardless of the type of messaging, all services share a common goal of diversifying communication among people (Vilhjálmsson 2003). This study aims to make messaging experiences more realistic by adding a framework for haptic interaction. The term haptic means pertaining to the sense of touch, and thus haptic communication can be described as "communicating via touching". Bonanni had an insight into this concept and tried to implement it (Bonanni et al. 2006); he studied ways to convey sensations from peer to peer. Rovers introduced HIM, an emoticon with an embedded vibro-tactile pattern (A. F. Rovers and Van Essen 2004). His research method is quite similar to the one proposed in this chapter: the vibro-tactile pattern is embedded into an emoticon so that users can feel more realistic sensations while engaged in instant messaging. VibeTonz (Immersion Corp 2007) is a commercialized vibro-tactile composer from Immersion. As cellular phones with touch screens or conductive switches are being produced by a number of manufacturers these days, Immersion's VibeTonz technology is actively employed; VibeTonz can compose tactile output patterns along a timeline. Although many studies have led to touch-enabled emoticons (Chang et al. 2002; L.
Rovers and Van Essen 2004; Aleven et al. 2006), most of them were limited to conveying vibro-tactile actuation. The sense of touch and its related sensations encompass not only tactile stimuli but also temperature, sound, etc. For this reason, a framework to send and receive the whole spectrum of haptic sensations is strongly required. The objective of this research is to facilitate haptic communication among users and to expand the scope of computer-mediated conversation. Bidirectional haptics means that a sensor and an actuator can be manipulated on a single framework. This is a simple concept, but most studies tend to focus on one side only; to achieve true haptic communication, a system providing both a sensor and an actuator within a single framework is needed. Brave introduced inTouch (Brave and Dahley 1997) to synchronize two cylinder-like devices. The two devices are connected, and each combines a sensor and an actuator in a single tangible object: when one user rolls one device, the motor in the other starts to run. HAML (El-Far et al. 2006) is a haptic markup language that centers on haptic description; it is a technical specification proposed for elevation to the MPEG standards, and the Phantom device is mainly applied in that research. HAMLET (Mohamad Eid et al. 2008) is a HAML-based authoring tool. Both HAMLET and this research aim at simplicity and efficiency in utilizing haptics for non-programmer developers and artists. However, our target users are general users who use instant messengers as a daily communication tool, rather than HAMLET's audience. From the viewpoint of description languages, or markup languages, SensorML (Botts and Robin 2007) is one specification for describing a sensor. The goal of this markup language is to provide sensor information in as much detail as possible, including the manufacturer, hardware specifications, the data type used to acquire a result, etc.
SensorML could be adopted in our work, but we concluded that it is too verbose for our purposes. In this study, TouchCon, a next-generation emoticon for haptic-embedded communication, is proposed. The architecture of the framework to represent haptic expressions in our daily messaging and chatting is also provided. In addition, we describe the hardware specially designed for testing and summarize user preference surveys with reference to our previous research (Kim et al. 2009; Kim et al. 2009; Shin et al. 2007).

2. A Platform for Managing Haptic Communication

2.1 Overall Description

The proposed system enables a user to manipulate haptic interactions and to share them with others. To achieve this goal, we first summarize the requirements of the system: it needs to support haptic actuator control, sensor data acquisition, linkage with various applications, library management, etc. One important goal of this study is to resolve the haptic expression even when two devices are not identical. For this reason, the haptic communication framework has been designed for flexibility and scalability. Flexibility allows the framework to invite and manipulate different devices: to support haptic-enabled hardware, the framework must be capable of providing a standardized gateway. Thus, the architecture adopted here has a goal similar to that of a middleware system (Baldauf, Dustdar, and Rosenberg 2007) from the architectural point of view. Scalability means the framework is extensible to adopt various sensors and actuators according to their descriptions. For that, the framework has to allow various protocols. Figure 1 shows the overall architecture of the platform.

Fig. 1. Overall TouchCon architecture
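The "standardized gateway" and device-registration ideas behind the flexibility and scalability requirements can be sketched as follows. This is a hypothetical illustration in Python — the class and method names are ours, not the TouchCon framework's API:

```python
from abc import ABC, abstractmethod

class HapticDevice(ABC):
    """Standardized gateway: every sensor or actuator is described the same
    way, so the framework can handle non-identical devices uniformly."""
    @abstractmethod
    def describe(self) -> dict: ...

class Actuator(HapticDevice):
    @abstractmethod
    def play(self, pattern: list[tuple[int, float]]) -> None:
        """pattern: (duration_ms, intensity 0..1) pairs along a timeline."""

class VibrationMotor(Actuator):
    def describe(self) -> dict:
        return {"type": "actuator", "modality": "vibration"}
    def play(self, pattern):
        for duration_ms, intensity in pattern:
            print(f"vibrate at {intensity:.1f} for {duration_ms} ms")

class DeviceRegistry:
    """Scalability: new device types are added by registration,
    not by changing framework code."""
    def __init__(self):
        self._devices: dict[str, HapticDevice] = {}
    def register(self, name: str, device: HapticDevice) -> None:
        self._devices[name] = device
    def get(self, name: str) -> HapticDevice:
        return self._devices[name]

registry = DeviceRegistry()
registry.register("motor0", VibrationMotor())
registry.get("motor0").play([(200, 0.8), (100, 0.3)])
```

A temperature actuator or a pressure sensor would plug into the same registry with its own `describe()` output, which is the middleware-like decoupling the chapter argues for.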
A vibro-tactile feedback enables a user to control a device more accurately and easily. For the next step, haptic aims to give intuitiveness to control target devices. This is mainly because, from a cognitive point of view, users expect a kind of reaction if he or she tries to command to the target. Haptic technologies are widely employed in many areas these days, but in this chapter, we will focus on its communication usage only. As shown in many studies, haptic can be a type of daily messaging behaviours. Computer-mediated messaging technologies continue to evolve rapidly, and various types of messaging services are being marketed including short message services (SMS’s) provided on a cellular phone, message-oriented networking services such as Twitter (Java et al. 2007), blogs with trackback and reply systems, and instant messenger applications that enable peer-to-peer communication in real-time. More innovative types of messaging will continue to emerge (Poupyrev, Nashida, and Okabe 2007). Regardless of the type of messaging, all services share a common goal of diversifying communications among people (Vilhjálmsson 2003). This study aims to improve messaging experiences more realistic by adding a framework for haptic interaction. The term haptic means pertaining to the sense of touch, and thus haptic communication can be described as “communicating via touching”. Bonanni had an insight into this concept and tried to implement it (Bonanni et al. 2006). He had studied the way to convey sensations from peer to peer. Rovers had introduced the vibro-tactile-pattern-embedded emoticon named HIM (A. F. Rovers and Van Essen 2004). His research method is quite similar to that proposed in this chapter. The vibro-tactile pattern is embedded into an emoticon so that users can feel more realistic sensations while engaged in instant messaging. VibeTonz (Immersion Corp 2007) is a commercialized vibro-tactile composer from Immersion. 
As cellular phones with touch screens or conductive switches are now produced by a number of manufacturers, Immersion's VibeTonz technology is actively employed. VibeTonz can compose tactile output patterns along a timeline. Although much research has led to touch-enabled emoticons (Chang et al. 2002; L. Rovers and Van Essen 2004; Aleven et al. 2006), most of it has been limited to conveying vibro-tactile actuation. The components of touch and the related sensations encompass not only tactile stimuli but also temperature, sound, etc. For this reason, a framework for sending and receiving the whole spectrum of haptics is strongly required.

The objective of this research is to facilitate haptic communication among users and to expand the scope of computer-mediated conversation. Bidirectional haptics means that a sensor and an actuator can be manipulated within a single framework. This is a simple concept, but most research tends to focus on one side only; to achieve true haptic communication, a system providing both a sensor and an actuator within a single framework is needed. Brave introduced in-Touch (Brave and Dahley 1997) to synchronize two cylinder-like devices: the connected devices each combine a sensor and an actuator in one tangible object, so when one user rolls one device, the motor in the other starts to run. HAML (El-Far et al. 2006) is a haptic markup language centered on haptic description; it is a technical specification that aims to be elevated to the MPEG standards, and the research around it mainly uses the Phantom device. HAMLET (Mohamad Eid et al. 2008) is a HAML-based authoring tool. Both HAMLET and this research aim for simplicity and efficiency in making haptics usable by non-programmer developers and artists; however, our target users are general users who rely on instant messengers as a daily communication tool, rather than HAMLET's developer audience.
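The bidirectional requirement, a sensor and an actuator handled through one framework, can be sketched as a minimal interface. The class and function names below are invented for illustration, not the chapter's actual API:

```python
from abc import ABC, abstractmethod

class HapticDevice(ABC):
    """A device that both senses and actuates, so a single framework
    object covers the bidirectional case."""

    @abstractmethod
    def read_sensor(self) -> float:
        """Return the current sensor reading (e.g. pressure in [0, 1])."""

    @abstractmethod
    def actuate(self, intensity: float, duration_ms: int) -> None:
        """Drive the actuator (e.g. a vibration motor)."""

class VibroPad(HapticDevice):
    """Toy in-memory device, used here only to show the data flow."""

    def __init__(self) -> None:
        self.last_pressure = 0.0   # what the sensor currently reads
        self.played = []           # actuation commands received so far

    def read_sensor(self) -> float:
        return self.last_pressure

    def actuate(self, intensity: float, duration_ms: int) -> None:
        self.played.append((intensity, duration_ms))

def relay(sender: HapticDevice, receiver: HapticDevice,
          duration_ms: int = 100) -> None:
    """The in-Touch-style loop: what one side senses, the other plays."""
    receiver.actuate(sender.read_sensor(), duration_ms)

a, b = VibroPad(), VibroPad()
a.last_pressure = 0.7   # simulate a touch on device A
relay(a, b)             # device B now vibrates at the sensed intensity
print(b.played)         # [(0.7, 100)]
```

Because both directions go through the same `HapticDevice` abstraction, the relay works unchanged whichever side is sensing and whichever is actuating.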
From the viewpoint of description languages, or markup languages, SensorML (Botts and Robin 2007) is one specification for describing a sensor. Its objective is to provide sensor information in as much detail as possible, including the manufacturer, hardware specifications, the data type used to report results, etc. It could have been adopted in our work, but we concluded that SensorML is too verbose for our purposes.

In this study, TouchCon, a next-generation emoticon for haptic-embedded communication, is proposed. The architecture of a framework to represent haptic expressions in our daily messaging and chatting is also provided. In addition, we include the hardware specially designed for testing and a summary of user preference surveys, with reference to previous research (Kim et al. 2009; Kim et al. 2009; Shin et al. 2007).

2. A Platform for Managing Haptic Communication

2.1 Overall Description

The proposed system enables a user to manipulate haptic interactions and to share them with others. To achieve this goal, we first summarize the requirements of the system: it needs to support haptic actuator control, sensor data acquisition, linkage with various applications, library management, etc. One important goal of this study is to resolve a haptic expression even when the two devices involved are not identical. For this reason, the haptic communication framework has been designed for flexibility and scalability. Flexibility allows the framework to accept and manipulate different devices; to support haptic-enabled hardware, the framework must provide a standardized gateway. Thus, from an architectural point of view, the architecture adopted here has a goal similar to that of middleware systems (Baldauf, Dustdar, and Rosenberg 2007). Scalability means that the framework is extensible to adopt various sensors and actuators according to their descriptions.
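To make such descriptions concrete, the sketch below shows one plausible XML effect description and how a framework might load it. The element and attribute names are invented for illustration and are not the actual TouchCon schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical haptic-effect description; element and attribute names
# are invented for illustration, not the actual TouchCon schema.
PATTERN = """
<effect name="warm-hug">
  <actuator type="vibration" intensity="0.6" duration="300"/>
  <actuator type="heat" intensity="0.4" duration="1000"/>
</effect>
"""

def parse_effect(xml_text):
    """Return (name, steps), each step being (type, intensity, duration_ms)."""
    root = ET.fromstring(xml_text)
    steps = [(a.get("type"), float(a.get("intensity")), int(a.get("duration")))
             for a in root.findall("actuator")]
    return root.get("name"), steps

name, steps = parse_effect(PATTERN)
print(name)   # warm-hug
print(steps)  # [('vibration', 0.6, 300), ('heat', 0.4, 1000)]
```

Because each actuator is just another element, a device that lacks, say, a heater can skip unknown types, which is one way the extensibility goal above can be realized.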
To support this extensibility, the framework also has to allow various protocols. Figure 1 shows the overall architecture of the platform.

Fig. 1. Overall TouchCon architecture

[...]

[...] instant messaging. In Proceedings of EuroHaptics, Vol. 4.
Russinovich, M. E., and D. A. Solomon. 2005. Microsoft Windows Internals: Microsoft Windows Server 2003, Windows XP, and Windows 2000. Microsoft Press.
Shin, H., J. Lee, J. Park, Y. Kim, H. Oh, and T. Lee. 2007. A Tactile Emotional Interface for Instant Messenger Chat. Lecture Notes in Computer Science 4558: 166.
Vilhjálmsson, H. H. 2003. Avatar Augmented Online [...]

Realistic Haptics Interaction in Complex Virtual Environments

Hanqiu SUN, Department of Computer Science & Engineering, The Chinese University of Hong Kong, Hong Kong
Hui CHEN, Shenzhen Institute of Advanced Integration Technology, Chinese Academy of Sciences / The Chinese University of Hong Kong, China

1. Introduction

Simulating the interactive behavior of objects, such as soft tissues in surgical simulation or in the control engine of VR applications, has [...] Most haptic devices utilize point interactions, resulting in a conflict between the low information bandwidth and the further complication of data exploration. Unlike our sense of vision, haptic manipulation involves direct interaction with the objects being explored, providing the most intuitive way of applying 3D manipulation in virtual scenes. Utilizing multi-resolution methods in haptic display provides a [...]
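One common way to realize multi-resolution haptic display is to select mesh detail from the probe's distance to the surface. The following toy sketch uses arbitrary placeholder thresholds, not values from this chapter:

```python
def pick_mesh_level(probe_distance: float,
                    thresholds=(0.05, 0.15, 0.4)) -> int:
    """Toy multi-resolution selection: the closer the haptic probe is
    to the surface, the finer the mesh used for force computation.
    Level 0 is the finest; larger numbers are coarser."""
    for level, limit in enumerate(thresholds):
        if probe_distance < limit:
            return level
    return len(thresholds)          # far away: coarsest representation

print(pick_mesh_level(0.01))   # 0  (very close: finest mesh)
print(pick_mesh_level(0.20))   # 2
print(pick_mesh_level(2.00))   # 3  (coarsest)
```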
[...] point i, d_i is the damping constant of the same point, r_ij is the vector distance between point i and point j, l_ij is the rest length, and σ_ij is the stiffness of the spring connecting the two mass points. The right-hand term F_i is the sum of the other external forces. The motion equations for the entire system are assembled by concatenating the position vectors of the N individual mass points into a single [...]

[...] http://www.immersion.com/docs/haptics_mobile-ue_nov07v1.pdf
Java, A., X. Song, T. Finin, and B. Tseng. 2007. Why we twitter: understanding microblogging usage and communities. In Proceedings of the 9th WebKDD and 1st SNA-KDD 2007 workshop on Web mining and social network analysis, 56-65. ACM, New York, NY, USA.
Jung, Chanhee. 2008. Design of Vibro-tactile Patterns for Emotional Expression in Online Environments. Thesis [...]

[...] XML schemas in order to manage haptic commands and to activate haptic-enabled hardware. Three factors must be taken into consideration when designing the schemas.
- Scalability: To cover an abundance of haptic interactions and to support combinations of sensors and actuators, scalability must be considered in the system. This is the main reason why the XML format is adopted in this study. [...]

[...] given shape, including the hand or fingers. Most studies simplified the haptic tool-object interaction paradigm into multiple point contacts (Colgate et al., 1995), which provide a convenient simplification because the system needs only render forces resulting from contact between the tool's avatar and objects in the environment. The force feedback is generated based on the spring/damping linked to the [...]
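The spring/damper point-contact model mentioned above is standard penalty-based rendering. A minimal sketch, with placeholder stiffness and damping gains rather than values from this chapter:

```python
def contact_force(penetration: float, normal_velocity: float,
                  k: float = 300.0, b: float = 2.5) -> float:
    """Penalty-style point contact along the surface normal.

    penetration: depth of the haptic point inside the object (m)
    normal_velocity: point velocity along the outward normal
                     (positive = withdrawing from the surface)
    Returns the outward force magnitude (N); zero when not in contact.
    """
    if penetration <= 0.0:
        return 0.0                      # no contact, no force
    force = k * penetration - b * normal_velocity
    return max(force, 0.0)              # never pull the point inward

# 2 mm indentation while slowly withdrawing: a small outward force
f = contact_force(0.002, 0.01)          # ~0.575 N with these gains
```

Clamping at zero is a common safeguard: the damper may reduce the spring force during withdrawal, but it should never make the device pull the finger into the surface.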
[...] descriptions of the scene in a haptic environment based on the affine median filter, providing users with views of the scene at varying resolutions. Zhang et al. (Zhang et al., 2002) applied haptic rendering at different levels of detail of a soft object by subdividing the area of interest on a relatively coarse mesh model and evaluating the spring constants after haptic subdivision. Otaduy & Lin (Otaduy & Lin, 2003) provided [...]

[...] most simulated palpation forces were reduced to a point-based interaction model with a spring-damper linkage to simulate the contact between one or more fingertips and the virtual object. Some special haptic devices were created and applied in breast palpation simulation. The contact problem between two elastic solids pressed together by an applied force was first solved by Hertz in 1882. [...]

[...] introduce ATMs with a newly installed haptic feedback-equipped touchscreen. In short, haptic technology is making gradual changes in our daily lifestyle. [...]

[...] soft-feeling silicone material. In addition, we found that the silicone finish could prevent the user from being injured by the embedded heater units.