Driver Usb Irda Sangha Space


Polar IrDA USB Adapter Drivers. The Polar IrDA USB Adapter is supported on Windows 98SE/ME/2000/XP 32/64-bit, Windows Vista, Windows 7 32/64-bit, and Windows 8 32/64-bit. Please make sure you have installed the latest Microsoft Service Pack before installation. Download the drivers, save the .zip file to a temporary location, and extract it with, for example, WinZip. See the user manual for installation instructions. After installing the drivers, restart your computer.

Q: How much hard drive space can be added to a Dell Latitude CPX laptop? (Robertjones) Q: I have a Fujitsu. Q: I have a Toshiba Satellite Pro A10 and a Mercer USB-to-IrDA infrared adapter with Windows XP Professional as the operating system, and am looking for drivers to make this work. (Pieter Reichert)


When installing the drivers on Windows Vista/7 64-bit or Windows 8, please make sure that you are logged onto your computer as an administrator. Admin rights are necessary to install the drivers. Do not use these drivers for the silver-colored adapter (pictured below).

Disclosed are a vehicle and a control method for the same. The vehicle surround monitoring device (100) of claim 10 or 11, wherein the controller (170) is configured to, based on the determination that one of the one or more sub-areas overlaps the predicted route of the vehicle, perform one or more of the following operations: (i) providing a warning to a driver of the vehicle, (ii) generating speed control information for the vehicle, (iii) generating steering control information for the vehicle, and (iv) generating light control information for the vehicle. The vehicle surround monitoring device (100) of claim 12, wherein the controller (170) is configured to perform a first operation, among the one or more operations, for the sub-area that overlaps the predicted route of the vehicle, wherein the controller (170) is preferably configured to adjust control parameters for at least one of the warning, the speed control information, the steering control information, and the light control information based on a distance between the vehicle and the sub-area that overlaps the predicted route of the vehicle. In order to increase the safety and convenience of a user who uses the vehicle, technology to equip vehicles with, for example, a variety of sensors and electronic devices is being aggressively developed.

In particular, systems which provide various functions developed for user driving convenience (e.g., smart cruise control and lane keeping assistance) have been mounted in vehicles. Thereby, so-called autonomous driving, which enables a vehicle to autonomously travel on the road in consideration of the external environment without user operation, has recently become possible. However, the related art merely detects moving objects around a vehicle and thereafter informs the driver of the presence of those objects. That is, despite the fact that the degree of risk that a moving object will collide with the vehicle differs according to the inherent speed, movement direction, or type of the object, the object detection method of the related art provides the driver with only simple information related to the risk of a collision between each moving object and the vehicle, which does not help the driver preemptively respond to the risk of an accident.

The above and other objects, features and advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:

• FIG. 1 is a block diagram of a vehicle according to one embodiment of the present invention;
• FIG. 2 is a view illustrating the external appearance of a vehicle according to one embodiment of the present invention, for convenience of description, the vehicle being assumed to be a four-wheeled vehicle;
• FIGs. 3A to 3C are views referenced to describe an external camera illustrated in FIG. 1;
• FIG. 4 is a view illustrating one example of the vehicle illustrated in FIG. 1, for convenience of description, the vehicle being assumed to be a four-wheeled vehicle;
• FIG. 5 is an exemplary block diagram illustrating the interior configuration of a controller illustrated in FIG. 1;
• FIGs. 6A and 6B are views referenced to describe the operation of the controller illustrated in FIG. 5;
• FIG. 7 is a flowchart illustrating an exemplary process to be performed by the vehicle according to one embodiment of the present invention;
• FIGs. 8A and 8B are views illustrating one example of a dangerous area set on a per-moving-object-type basis according to one embodiment of the present invention;
• FIG. 9 is a view illustrating another example of a dangerous area set on a per-moving-object-type basis according to one embodiment of the present invention;
• FIG. 10 is a view illustrating a further example of a dangerous area set on a per-moving-object-type basis according to one embodiment of the present invention;
• FIG. 11 is a view illustrating an exemplary method of changing, by the vehicle, a dangerous area for a moving object based on the speed of the moving object according to one embodiment of the present invention;
• FIGs. 12A and 12B are views illustrating an exemplary method of changing, by the vehicle, a dangerous area based on the predicted movement direction of the moving object according to one embodiment of the present invention;
• FIG. 13 is a view illustrating one example of a user interface screen provided to a user via a display unit of the vehicle according to one embodiment of the present invention;
• FIGs. 14A to 14C are views illustrating a method of displaying, by the vehicle, an image corresponding to a dangerous area according to one embodiment of the present invention;
• FIGs. 15A and 15B are views respectively illustrating a method of adjusting, by the vehicle, the scale of a map based on the surrounding congestion in a top-view mode according to one embodiment of the present invention;
• FIG. 16 is a view illustrating a data table which defines the relationship between sub areas included in a dangerous area and functions according to one embodiment of the present invention;
• FIG. 17 is a view illustrating one example of a map on which a dangerous area for a moving object is displayed according to one embodiment of the present invention;
• FIG. 18 is a view illustrating one example of a map on which a dangerous area for the moving object associated with FIG. 17 is displayed according to one embodiment of the present invention;
• FIG. 19 is a view illustrating one example of a map on which a dangerous area for the moving object associated with FIG. 18 is displayed according to one embodiment of the present invention;
• FIG. 20 is a view illustrating one example of a map which is displayed when the vehicle executes an emergency braking function in association with FIG. 19;
• FIG. 21 is a view illustrating one example of a map on which a dangerous area for the moving object associated with FIG. 19 is displayed according to one embodiment of the present invention;
• FIG. 22 is a view illustrating one example of a map which is displayed when the vehicle executes an emergency braking function and an emergency steering function in association with FIG. 21;
• FIG. 23 is a view illustrating one example in which the vehicle determines a control parameter for a specific function based on a dangerous area for a moving object according to one embodiment of the present invention;
• FIG. 24 is a view illustrating another example in which the vehicle determines a control parameter for a specific function based on a dangerous area for the moving object associated with FIG. 23 according to one embodiment of the present invention;
• FIGs. 25A and 25B are views illustrating one example in which the vehicle indicates the risk of a collision with a moving object in an augmented-reality mode according to one embodiment of the present invention;
• FIG. 26 is a view illustrating the concept of V2X communication that may be performed by the vehicle according to one embodiment of the present invention; and
• FIGs. 27A and 27B are views illustrating one example in which the vehicle uses outside illuminance to send a warning signal to a moving object that is at risk of colliding with the vehicle according to one embodiment of the present invention.

DETAILED DESCRIPTION

Hereinafter, the embodiments disclosed in the present specification will be described in detail with reference to the accompanying drawings; the same or similar elements are denoted by the same reference numerals even when they are depicted in different drawings, and redundant descriptions thereof will be omitted. In the following description, the suffixes 'module' and 'unit' are given to constituent elements or used interchangeably only in consideration of ease of preparation of the specification, and do not in themselves carry different meanings.

Accordingly, the suffixes 'module' and 'unit' may be used interchangeably. In addition, in the following description of the embodiments disclosed in the present specification, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the embodiments disclosed in the present specification rather unclear. In addition, the accompanying drawings are provided only for a better understanding of the embodiments disclosed in the present specification and are not intended to limit the technical ideas disclosed in the present specification. Therefore, it should be understood that the accompanying drawings include all modifications, equivalents and substitutions included in the scope and spirit of the present invention. It will be understood that when a component is referred to as being 'connected to' or 'coupled to' another component, it may be directly connected to or coupled to the other component, or intervening components may be present. In contrast, when a component is referred to as being 'directly connected to' or 'directly coupled to' another component, there are no intervening components present.

In addition, it will be understood that when a component is referred to as 'controlling' another component, it may directly control that component, or may control it via the mediation of a third component. Similarly, when a component is referred to as 'providing' another component with information and signals, it may provide them directly or via the mediation of a third component. Examples of such wireless Internet technologies include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A).

The wireless Internet module 112 transmits and receives data according to one or more of such wireless Internet technologies and other Internet technologies as well. For example, the wireless Internet module 112 may exchange data with the external server in a wireless manner. The wireless Internet module 112 may receive weather information and road traffic state information (e.g., Transport Protocol Expert Group (TPEG) information) from the external server. The short-range communication module 113 may form wireless area networks to perform the short-range communication between the vehicle surround monitoring device 100 and at least one external device. For example, the short-range communication module 113 may exchange data with a mobile terminal of a passenger in a wireless manner.

The short-range communication module 113 may receive weather information and road traffic state information (e.g., Transport Protocol Expert Group (TPEG) information) from the mobile terminal or the external server. When a user gets into the vehicle surround monitoring device 100, the mobile terminal of the user and the vehicle surround monitoring device 100 may pair with each other automatically or as the user executes a pairing application.

The light emitting unit may include at least one light emitting element to convert electrical signals into light. Here, the light emitting element may be a Light Emitting Diode (LED). The light emitting unit converts electrical signals into light to thereby emit the light.

For example, the light emitting unit may externally emit light via flickering of the light emitting element corresponding to a prescribed frequency. In some embodiments, the light emitting unit may include an array of a plurality of light emitting elements. In some embodiments, the light emitting unit may be integrated with a lamp provided in the vehicle surround monitoring device 100. For example, the light emitting unit may be at least one selected from among a headlight, a taillight, a brake light, a turn signal light, and a sidelight. For example, the optical communication module 115 may exchange data with another vehicle via optical communication. The acceleration input unit 121c is configured to receive user input for the acceleration of the vehicle surround monitoring device 100.

The brake input unit 121d is configured to receive user input for the speed reduction of the vehicle surround monitoring device 100. Each of the acceleration input unit 121c and the brake input unit 121d may have a pedal form.

In some embodiments, the acceleration input unit 121c or the brake input unit 121d may be configured as a touchscreen, a touch pad, or a button. The camera 122 may be located at one side of the space inside the vehicle surround monitoring device 100 to produce an image of the interior of the vehicle surround monitoring device 100. For example, the camera 122 may be located at various positions inside the vehicle surround monitoring device 100 such as, for example, the dashboard surface, the interior roof surface, and a rear view mirror, and may serve to capture an image of a passenger of the vehicle surround monitoring device 100. In this case, the camera 122 may produce an image of the passenger compartment including the driver's seat of the vehicle surround monitoring device 100.

In addition, the camera 122 may produce an image of the passenger compartment including the driver's seat and a passenger's seat of the vehicle surround monitoring device 100. The image of the passenger compartment produced by the camera 122 may include a 2-dimensional (2D) and/or 3-dimensional (3D) image. To produce the 3D image, the camera 122 may include at least one of a stereo camera, a depth camera, and a 3D laser scanner. The camera 122 may provide the image of the interior, produced thereby, to the controller 170, which is functionally coupled thereto.

The controller 170 may detect a variety of objects by analyzing the indoor image provided from the camera 122. For example, the controller 170 may detect the driver's eyes and/or gestures from the portion of the indoor image that corresponds to the area of the driver's seat. In another example, the controller 170 may detect a passenger's eyes and/or gestures from the portion of the indoor image that corresponds to the indoor area excluding the area of the driver's seat. Of course, the driver's eyes and/or gestures and the passenger's eyes and/or gestures may be detected simultaneously. The user input unit 124 is configured to receive information from the user. When information is input via the user input unit 124, the controller 170 may control the operation of the vehicle surround monitoring device 100 so as to correspond to the input information.

The user input unit 124 may include a touch input unit or a mechanical input unit. In some embodiments, the user input unit 124 may be located in a region of the steering wheel. In this case, the driver may operate the user input unit 124 with the fingers while gripping the steering wheel.

The sensing unit 160 is configured to sense signals associated with, for example, the driving of the vehicle surround monitoring device 100. To this end, the sensing unit 160 may include a collision sensor, a speed sensor, a gradient sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on the rotation of the steering wheel, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an infrared sensor, a radar 162, a LiDAR 163, and an ultrasonic sensor 164. As such, the sensing unit 160 may acquire sensing signals with regard to, for example, vehicle collision information, vehicle traveling direction information, vehicle location information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, and steering wheel rotation angle information. In addition, the controller 170 may generate control signals for the acceleration, speed reduction, and direction change of the vehicle surround monitoring device 100 based on, for example, surrounding environment information acquired by at least one of the camera, the ultrasonic sensor, the infrared sensor, the radar, and the LiDAR included in the vehicle surround monitoring device 100. Here, the surrounding environment information may be information related to various objects located within a prescribed distance range from the vehicle surround monitoring device 100 while it is traveling. For example, the surrounding environment information may include the number of obstacles located within a distance of 100 m from the vehicle surround monitoring device 100, the distances to the obstacles, the sizes of the obstacles, the kinds of the obstacles, and the like. The sensing unit 160 may include a biometric information sensing unit.
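As a rough, hypothetical sketch of what such surrounding environment information could look like in code: the record layout and field names below are illustrative assumptions, not a format from the text; only the 100 m range mirrors the example above.

```python
# Hypothetical sketch of the surrounding environment information described
# above: obstacles within a prescribed range (100 m in the text's example),
# with their distance, size, and kind. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # distance from the vehicle to the obstacle
    size_m: float       # rough size (e.g., bounding-box diagonal)
    kind: str           # e.g., "vehicle", "pedestrian", "fallen object"

def surrounding_info(obstacles: list, range_m: float = 100.0) -> dict:
    # Keep only obstacles inside the prescribed distance range.
    in_range = [o for o in obstacles if o.distance_m <= range_m]
    return {"count": len(in_range), "obstacles": in_range}

info = surrounding_info([Obstacle(42.0, 1.8, "pedestrian"),
                         Obstacle(130.0, 4.5, "vehicle")])
print(info["count"])  # 1 -> only the pedestrian is within 100 m
```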

The biometric information sensing unit is configured to sense and acquire biometric information of the passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor to sense biometric information of the passenger.

Here, the camera 122 and the microphone 123 may operate as sensors. The biometric information sensing unit may acquire hand geometry information and facial recognition information via the camera 122.

The sensing unit 160 may include at least one camera 161 configured to capture an image of the outside of the vehicle surround monitoring device 100. The camera 161 may be referred to as an external camera. For example, the sensing unit 160 may include a plurality of cameras 161 arranged at different positions at the exterior of the vehicle surround monitoring device 100. Each camera 161 may include an image sensor and an image processing module. The camera 161 may process a still image or moving image acquired by the image sensor (e.g., a CMOS or CCD). The image processing module may extract required information by processing the still image or moving image acquired by the image sensor, and may transmit the extracted information to the controller 170. The display unit 141 may display information processed in the controller 170.

For example, the display unit 141 may display vehicle associated information. Here, the vehicle associated information may include vehicle control information for the direct control of the vehicle surround monitoring device 100 or driver assistance information to guide the driver's vehicle driving. In addition, the vehicle associated information may include vehicle state information that indicates the current state of the vehicle or vehicle traveling information regarding the traveling of the vehicle. The display unit 141 may configure an inter-layer structure along with a touch sensor or may be integrally formed with the touch sensor, so as to implement a touchscreen. The touchscreen may function as the user input unit 124 which provides an input interface between the vehicle surround monitoring device 100 and the user, and also function to provide an output interface between the vehicle surround monitoring device 100 and the user. In this case, the display unit 141 may include a touch sensor which senses a touch to the display unit 141 so as to receive a control command in a touch manner. When a touch is input to the display unit 141 as described above, the touch sensor may sense the touch and the controller 170 may generate a control command corresponding to the touch.

Content input in a touch manner may be characters or numbers, or may be, for example, instructions in various modes or menu items that may be designated. For example, in the case where a fossil fuel based engine (not illustrated) is a power source, the power source drive unit 151 may perform electronic control for the engine. As such, the power source drive unit 151 may control, for example, an output torque of the engine. In the case where the power source drive unit 151 is the engine, the power source drive unit 151 may control the speed of the vehicle surround monitoring device 100 by controlling the output torque of the engine under the control of the controller 170.

The steering drive unit 152 may include a steering apparatus. Thus, the steering drive unit 152 may perform electronic control for a steering apparatus inside the vehicle surround monitoring device 100.

For example, the steering drive unit 152 may include a steering torque sensor, a steering angle sensor, and a steering motor. The steering torque, applied to the steering wheel by the driver, may be sensed by the steering torque sensor.

The steering drive unit 152 may control steering force and a steering angle by changing the magnitude and direction of current applied to the steering motor based on, for example, the speed and the steering torque of the vehicle surround monitoring device 100. In addition, the steering drive unit 152 may determine whether the direction of travel of the vehicle surround monitoring device 100 is correctly being adjusted based on steering angle information acquired by the steering angle sensor. As such, the steering drive unit 152 may change the direction of travel of the vehicle surround monitoring device 100.

In addition, the steering drive unit 152 may reduce the sense of weight of the steering wheel by increasing the steering force of the steering motor when the vehicle surround monitoring device 100 travels at a low speed, and may increase the sense of weight of the steering wheel by reducing the steering force of the steering motor when the vehicle surround monitoring device 100 travels at a high speed. In addition, when the autonomous driving function of the vehicle surround monitoring device 100 is executed, the steering drive unit 152 may control the steering motor to generate appropriate steering force based on, for example, the sensing signals output from the sensing unit 160 or control signals provided by the controller 170, even in the state in which the driver does not operate the steering wheel (i.e., the state in which no steering torque is sensed).

The brake drive unit 153 may perform electronic control of a brake apparatus (not illustrated) inside the vehicle surround monitoring device 100. For example, the brake drive unit 153 may reduce the speed of the vehicle surround monitoring device 100 by controlling the operation of brakes located at wheels. In another example, the brake drive unit 153 may adjust the direction of travel of the vehicle surround monitoring device 100 leftward or rightward by differentiating the operation of respective brakes located at left and right wheels.

The wiper drive unit 159 may perform the control of wipers 14a and 14b included in the vehicle surround monitoring device 100. For example, the wiper drive unit 159 may perform electronic control with regard to, for example, the number of operations and the speed of operation of the wipers 14a and 14b in response to user input upon receiving the user input that directs operation of the wipers 14a and 14b through the user input unit 124.

In another example, the wiper drive unit 159 may determine the amount or strength of rainwater based on sensing signals of a rain sensor included in the sensing unit 160 so as to automatically operate the wipers 14a and 14b without the user input. The memory 130 is electrically connected to the controller 170. The memory 130 may store basic data for each unit, control data for the operation control of the unit, and input/output data. The memory 130 may be any of various storage devices such as, for example, a ROM, a RAM, an EPROM, a flash drive, and a hard drive. The memory 130 may store various kinds of data for the overall operation of the vehicle surround monitoring device 100 such as, for example, programs for the processing or control of the controller 170.

The interface unit 180 may receive vehicle speed information, steering wheel rotation angle information, or gearshift information. The interface unit 180 may receive vehicle speed information, steering wheel rotation angle information, or gearshift information sensed via the sensing unit 160 of the vehicle surround monitoring device 100. Alternatively, the interface unit 180 may receive vehicle speed information, steering wheel rotation angle information, or gearshift information from the controller 170 of the vehicle surround monitoring device 100. Meanwhile, here, gearshift information may be information related to the current gear position of the vehicle surround monitoring device 100. For example, gearshift information may be information regarding whether the gearshift is in any one of Park (P), Reverse (R), Neutral (N), and Drive (D), or numbered gears. The controller 170 of the vehicle surround monitoring device 100 according to the embodiment of the present invention may serve to generate a surround-view image of the vehicle surround monitoring device 100 using the camera 161, to detect information from the generated surround-view image, and to output a control signal, which is required to execute any operation related to the vehicle surround monitoring device 100 based on the detected information, to the drive unit 150.

For example, the controller 170 may control a steering apparatus based on a control signal. Meanwhile, the full height H of the vehicle surround monitoring device 100 is the distance from the tire tread to the uppermost point of the vehicle body, and may be changed within a prescribed range according to, for example, the weight or position of passengers or luggage in the vehicle surround monitoring device 100. In addition, the lowermost point of the body of the vehicle surround monitoring device 100 and the road surface may be spaced apart from each other by the minimum ground clearance G.

This may prevent damage to the vehicle body due to any object having a lower height than the minimum ground clearance G. In addition, the distance between the front left and right tires 11a and 11b and the distance between the rear left and right tires 11c and 11d of the vehicle surround monitoring device 100 are assumed to be equal to each other. Hereinafter, the distance between the inner side of the front-wheel left tire 11a and the inner side of the front-wheel right tire 11b and the distance between the inner side of the rear-wheel left tire 11c and the inner side of the rear-wheel right tire 11d are assumed to have the same value T. The camera 161 may acquire stereo images of the view in front of the vehicle surround monitoring device 100 from the first and second cameras 310 and 320. In addition, the camera 161 may perform binocular disparity detection based on the stereo images, and then perform detection of at least one object (e.g., a pedestrian, a traffic light, a road, a traffic lane marker, and another vehicle) shown in at least one stereo image based on the binocular disparity information. After the object detection, the camera 161 may continuously track the movement of the object.

Referring to FIG. 3C, a composite image 400 may include a first image region 401 which corresponds to an external image captured by the front camera 161a, a second image region 402 which corresponds to an external image captured by the left camera 161b, a third image region 403 which corresponds to an external image captured by the right camera 161c, and a fourth image region 404 which corresponds to an external image captured by the rear camera 161d. The composite image 400 may be called an around view monitoring image. The radar 162 may be mounted at one side of the vehicle surround monitoring device 100, and serve to emit electromagnetic waves to the vicinity of the vehicle surround monitoring device 100 and to receive the electromagnetic waves reflected from a variety of objects that are present in the vicinity of the vehicle surround monitoring device 100. For example, the radar 162 may acquire information related to, for example, the distance, direction, and height of any one object by measuring the time taken until the electromagnetic waves reflected by the corresponding object return thereto. The LiDAR 163 may be mounted at one side of the vehicle surround monitoring device 100, and serve to emit laser light in the vicinity of the vehicle surround monitoring device 100. The laser light, emitted by the LiDAR 163, may be scattered or reflected to thereby return to the vehicle surround monitoring device 100.

The LiDAR 163 may acquire information related to physical properties such as, for example, the distance, speed, and shape of a target located in the vicinity of the vehicle surround monitoring device 100, based on the time taken until the laser light returns, the strength of the laser light, the variation in frequency, and the variation in polarization. The object tracking unit 540 may track the verified object. For example, the object tracking unit 540 may sequentially verify an object in the acquired stereo images, calculate the motion or motion vector of the verified object, and track the movement of the object based on the calculated motion or motion vector. Consequently, the object tracking unit 540 may track, for example, an adjacent vehicle, a traffic lane marker, a road surface, a traffic sign, a dangerous area, and a tunnel located around the vehicle surround monitoring device 100. Specifically, at least one of the camera 161, the radar 162, the LiDAR 163, and the ultrasonic sensor 164, included in the sensing unit 160, may detect objects that are present in the environment surrounding the vehicle surround monitoring device 100, and may provide the controller 170 with sensing information related to the detected objects. For example, the sensing information provided from the sensing unit 160 may include a 3D image of the surrounding environment captured by the camera 161. Objects that may be detected by the sensing unit 160 may include, for example, other vehicles, motorcycles, bicycles, pedestrians, fallen objects, animals, buildings, traffic lights, road signs, and traffic lane markers.

For example, the controller 170 may classify, among the objects present in the surrounding environment, an object that is moving at a prescribed speed or higher (e.g., 1 m/s) as a moving object, and an object that is moving at a speed below the prescribed speed (e.g., 1 m/s) or is stationary as a stationary object. For example, when another vehicle in the vicinity of the vehicle surround monitoring device 100 is stationary, the vehicle surround monitoring device 100 may classify the other vehicle as a stationary object. The controller 170 may determine the type of the moving object based on the sensing information. For example, the controller 170 may determine the type of the detected moving object by extracting the contour of the moving object from the 3D image included in the sensing information to check the external appearance of the moving object, and comparing the checked external appearance with prestored templates. In another example, the controller 170 may determine the type of the moving object using, for example, a Histogram of Oriented Gradients (HOG) method.
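A minimal sketch of this moving/stationary classification, assuming per-object speed estimates are already available from the sensing information; the names and record layout are illustrative, not the document's implementation.

```python
# Illustrative sketch: classifying detected objects as moving or stationary
# by the prescribed speed threshold described above (1 m/s in the example).
from dataclasses import dataclass

MOVING_SPEED_THRESHOLD = 1.0  # m/s, the prescribed speed from the text

@dataclass
class DetectedObject:
    object_type: str   # e.g., "vehicle", "bicycle", "pedestrian"
    speed: float       # estimated speed in m/s

def classify_motion(obj: DetectedObject) -> str:
    # At or above the threshold -> moving object; slower or stationary -> stationary object.
    return "moving" if obj.speed >= MOVING_SPEED_THRESHOLD else "stationary"

print(classify_motion(DetectedObject("vehicle", 0.0)))   # stationary
print(classify_motion(DetectedObject("bicycle", 4.2)))   # moving
```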

In addition, the controller 170 may predict the future movement direction of the moving object based on variation in at least one of the position, speed, and direction of a plurality of points on the moving object. For example, when the type of the moving object is a bicycle, the controller 170 may simultaneously track any one point located in a wheel region of the bicycle, any one point located in a handle region, and any one point located in the region of the head of the bicycle rider, and may predict the movement speed and direction of the bicycle during the next second based on variation in at least one of the position, speed, and movement direction of the tracked points. The respective sub areas may indicate ranges within which the moving object can move during different time periods.

For example, one of the sub areas may indicate the range within which the moving object can move during a period of one second from the present, and another of the sub areas may indicate the range within which the moving object can move during the subsequent one second. The controller 170 may set or change at least one of the shape and size of each sub area based on at least one of the type, motion characteristics, and predicted movement direction of the moving object. In addition, the controller 170 may set or change the dangerous area for the moving object based on the illumination outside the vehicle surround monitoring device 100. For example, assuming that other conditions are the same, the controller 170 may increase the size of the dangerous area for the moving object as the illumination outside the vehicle surround monitoring device 100 decreases. This is because the driver of the vehicle surround monitoring device 100 takes a relatively long time to recognize the risk of a collision with a moving object in a dark environment. Meanwhile, the controller 170 may set a dangerous area for the moving object only when the type of the moving object corresponds to a type that is predesignated by the user.
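The following hedged sketch combines the two ideas just described: sub areas sized per successive one-second interval, and a dangerous area that grows as outside illumination drops. The scaling rule and all constants are illustrative assumptions, not the document's prescribed method.

```python
# Hedged sketch: each sub area covers the range the moving object can reach in
# a successive one-second interval, and the whole dangerous area grows as the
# outside illuminance drops (the driver needs more time to recognize the risk
# in the dark). The illuminance-to-scale mapping is invented for illustration.
def dangerous_area_lengths(speed: float, horizons=(1.0, 1.0, 1.0),
                           illuminance_lux: float = 10_000.0) -> list:
    # Darker surroundings -> larger area; clamp the scale factor to [1, 2].
    scale = min(2.0, max(1.0, 1000.0 / max(illuminance_lux, 1.0)))
    lengths, reach = [], 0.0
    for dt in horizons:
        reach += speed * dt * scale  # distance reachable by the end of this interval
        lengths.append(reach)
    return lengths  # cumulative extents of sub areas 1..N

print(dangerous_area_lengths(5.0))                        # daylight: [5, 10, 15]
print(dangerous_area_lengths(5.0, illuminance_lux=50.0))  # night: doubled area
```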

Specifically, the user can predesignate only the types for which the user wishes to receive guidance. For example, the user may select some of a plurality of types via the input unit 120 illustrated in FIG. 1.

The controller 170 may not set a dangerous area for the moving object when the type of the moving object detected by the sensing unit 160 does not correspond to the predesignated type. In one embodiment, the controller 170 may predict the route along which the vehicle surround monitoring device 100 will travel within a prescribed time based on the speed and movement direction of the vehicle surround monitoring device 100 provided from the sensing unit 160.

For example, when the vehicle surround monitoring device 100 is driving straight at 10 m/s, the controller 170 may predict that the vehicle surround monitoring device 100 will travel through the area of the road from the current position to 30 m ahead within the next three seconds. In one embodiment, the vehicle surround monitoring device 100 may display, on the display unit 141, at least one of the image indicating the dangerous area and the image indicating the predicted route in an augmented-reality mode or a top-view mode. For example, the vehicle surround monitoring device 100 may select the augmented-reality mode or the top-view mode in response to user input, and may display at least one of the image which indicates the dangerous area and the image which indicates the predicted route in the selected mode.
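For the straight-driving example above, the route length is simply speed times horizon; a one-line sketch under the constant-speed, constant-heading assumption:

```python
# Straight-line route prediction from the example above: at 10 m/s, driving
# straight, the vehicle covers 30 m within the next 3 seconds.
def predicted_route_length(speed_mps: float, horizon_s: float) -> float:
    # Assumes constant speed and heading over the prediction horizon.
    return speed_mps * horizon_s

assert predicted_route_length(10.0, 3.0) == 30.0
```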

In one embodiment, the vehicle surround monitoring device 100 may display, in the augmented-reality mode, the image indicating the dangerous area at a position close to the actual position of the moving object, as seen through the windshield of the vehicle surround monitoring device 100. For example, the vehicle surround monitoring device 100 may display the image indicating the dangerous area on the windshield via a head-up display (see reference numeral 141b in FIG. 14B) or a transparent display (see reference numeral 141c in FIG. 14C).

Meanwhile, the controller 170 may adjust, in the top-view mode, the scale of the map based on congestion around the vehicle surround monitoring device 100. In one embodiment, the controller 170 may estimate the total number of objects located in the environment surrounding the vehicle surround monitoring device 100 based on the sensing signal provided from the sensing unit 160, and may calculate congestion around the vehicle surround monitoring device 100 based on the estimated total number of objects. For example, the controller 170 may calculate the congestion around the vehicle surround monitoring device 100 in proportion to the total number of the objects located in the environment surrounding the vehicle surround monitoring device 100.

For example, the controller 170 may reduce the scale of the map displayed on the display unit 141 in the top-view mode as the congestion around the vehicle surround monitoring device 100 becomes higher, i.e., as the total number of objects in the vicinity of the vehicle surround monitoring device 100 becomes greater. Conversely, the controller 170 may increase the scale of the map displayed on the display unit 141 in the top-view mode as the congestion around the vehicle surround monitoring device 100 becomes lower, i.e., as the total number of objects in the vicinity of the vehicle surround monitoring device 100 becomes smaller. In this way, since the scale of the map is reduced in congested areas (e.g., an intersection during afternoon rush hour), the driver of the vehicle surround monitoring device 100 may more easily check moving objects that are closer to the vehicle surround monitoring device 100 than other objects.
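A minimal sketch of this congestion-to-scale mapping, assuming congestion is simply proportional to the detected object count; the constants and clamping bounds are illustrative.

```python
# Sketch: higher congestion (more surrounding objects) -> reduced map scale,
# i.e., a smaller displayed area, so the closest moving objects stand out.
def displayed_radius_m(num_objects: int) -> float:
    radius = 200.0 / (1 + num_objects)   # shrink with each additional object
    return max(25.0, min(200.0, radius)) # clamp to an illustrative range

print(displayed_radius_m(0))   # 200.0 m: open road, zoomed out
print(displayed_radius_m(7))   # 25.0 m: busy intersection, zoomed in
```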

Upon judging that a portion of the dangerous area set for the moving object overlaps the predicted route of the vehicle surround monitoring device 100, Step S770 may be performed. On the other hand, upon judging that no portion of the dangerous area set for the moving object overlaps the predicted route of the vehicle surround monitoring device 100, the controller 170 may determine that the risk of a collision between the vehicle surround monitoring device 100 and the moving object is zero or very low, thereby ending the process S700. In Step S770, the vehicle surround monitoring device 100 may execute a function corresponding to the portion of the dangerous area set for the moving object that overlaps the predicted route of the vehicle surround monitoring device 100. Specifically, the vehicle surround monitoring device 100 may execute at least one of a plurality of predetermined functions based on the sub area that overlaps the predicted route of the vehicle surround monitoring device 100.

For example, the predetermined functions may include at least one of (i) output of a visual or auditory warning to the driver of the vehicle surround monitoring device 100, (ii) control of the speed-reduction apparatus of the vehicle surround monitoring device 100, (iii) control of the steering apparatus of the vehicle surround monitoring device 100, and (iv) control of the lighting apparatus of the vehicle surround monitoring device 100. The predetermined functions may be set to defaults, or may be set in response to user input. For example, in the data table, any one sub area may be associated with a first function, and another sub area may be associated with a second function. In another example, in the data table, any one sub area may be associated with a first function, and another sub area may be associated with the first function and a second function.

When one of the sub areas associated with the first function overlaps the predicted route of the vehicle surround monitoring device 100, the controller 170 may execute the first function based on the data table. On the other hand, when one of the sub areas associated with the second function overlaps the predicted route of the vehicle surround monitoring device 100, the controller 170 may execute the second function based on the data table. Meanwhile, the controller 170 may adjust, based on the distance between the vehicle surround monitoring device 100 and the sub area that overlaps the predicted route of the vehicle surround monitoring device 100, the control parameters (e.g., braking force, steering angle, the volume of a warning sound, the size or brightness of a warning message, the strength of light beams output from the headlights, and the volume of the horn) of the function corresponding to the sub area that overlaps the predicted route of the vehicle surround monitoring device 100.

For example, when the function of braking the vehicle surround monitoring device 100 (e.g., a braking assistance function or an emergency braking function) corresponding to any one sub area is executed, the vehicle surround monitoring device 100 may generate greater braking force as the distance between the vehicle surround monitoring device 100 and the corresponding sub area is reduced. In another example, when the function of automatically steering the vehicle surround monitoring device 100 corresponding to another sub area is executed, the vehicle surround monitoring device 100 may set a greater steering angle as the distance between the vehicle surround monitoring device 100 and the corresponding sub area is reduced. The memory 130 may previously store the maximum speed, the maximum acceleration, and the minimum turning radius on a per-moving-object-type basis, and the controller 170 may set a dangerous area for each moving object based on the maximum speed and the maximum acceleration as well as the motion characteristics and the type of the moving object. For example, the maximum speed and the maximum acceleration on a per-object-type basis may be lowered in the sequence of the other vehicle 801, the motorcycle 802, the bicycle 803, and the pedestrian 804.
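The sub-area-to-function table and the distance-based parameter adjustment described above might be sketched as follows. The table contents follow the examples given later in the text (first sub area: warning; second: warning plus emergency braking; third: warning, emergency braking, and emergency steering; see FIG. 16), while the braking-force scaling law and its constants are assumptions.

```python
# Illustrative sketch of the data-table dispatch: each sub-area index maps to
# the functions executed when it overlaps the predicted route, and a control
# parameter (here, braking force) grows as the distance to the overlapping
# sub area shrinks.
FUNCTION_TABLE = {
    1: ["warning"],
    2: ["warning", "emergency_braking"],
    3: ["warning", "emergency_braking", "emergency_steering"],
}

def functions_for_overlap(sub_area_index: int) -> list:
    return FUNCTION_TABLE.get(sub_area_index, [])

def braking_force_fraction(distance_m: float, max_range_m: float = 50.0) -> float:
    # Closer overlapping sub area -> greater braking force, clamped to [0, 1].
    return min(1.0, max(0.0, 1.0 - distance_m / max_range_m))

print(functions_for_overlap(2))      # ['warning', 'emergency_braking']
print(braking_force_fraction(10.0))  # 0.8 -> strong braking when close
```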

In another example, the minimum turning radius on a per-object-type basis may be reduced in the sequence of the other vehicle 801, the motorcycle 802, the bicycle 803, and the pedestrian 804. As illustrated, even if the other vehicle 801, the motorcycle 802, the bicycle 803, and the pedestrian 804 have the same speed (e.g., 5 m/s), the controller 170 may set, based on the maximum speed and the maximum acceleration on a per-object-type basis stored in the memory 130, the longest dangerous area 810 for the other vehicle 801, the second longest dangerous area 820 for the motorcycle 802, the third longest dangerous area 830 for the bicycle 803, and the shortest dangerous area 840 for the pedestrian 804.
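A hedged sketch of this per-type sizing, assuming the memory stores per-type maximum speed and acceleration as described; the numeric values are invented for illustration and merely preserve the ordering vehicle > motorcycle > bicycle > pedestrian.

```python
# Sketch of per-type dangerous-area length at a common current speed. The
# per-type limits below are illustrative placeholders, not values from the text.
MAX_SPEED_MPS = {"vehicle": 50.0, "motorcycle": 40.0, "bicycle": 15.0, "pedestrian": 4.0}
MAX_ACCEL_MPS2 = {"vehicle": 8.0, "motorcycle": 6.0, "bicycle": 2.0, "pedestrian": 1.0}

def area_length(obj_type: str, speed: float, horizon_s: float = 1.0) -> float:
    # Reachable distance over the horizon: current speed plus the worst-case
    # acceleration term, capped by the type's maximum speed.
    v_end = min(speed + MAX_ACCEL_MPS2[obj_type] * horizon_s, MAX_SPEED_MPS[obj_type])
    return (speed + v_end) / 2.0 * horizon_s  # trapezoidal distance estimate

for t in ("vehicle", "motorcycle", "bicycle", "pedestrian"):
    print(t, round(area_length(t, 5.0), 2))  # same 5 m/s -> longest for the vehicle
```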

Next, referring to FIG. 8B, the controller 170 may divide the dangerous area for each moving object into two or more sub areas. That is, the dangerous area may include two or more sub areas. As exemplarily illustrated in FIG. 8B, each of the four dangerous areas 810, 820, 830 and 840 may include three sub areas. Specifically, the dangerous area 810 for the other vehicle 801 may include first to third sub areas 811 to 813. In addition, the dangerous area 820 for the motorcycle 802 may include first to third sub areas 821 to 823.

In addition, the dangerous area 830 for the bicycle 803 may include first to third sub areas 831 to 833. In addition, the dangerous area 840 for the pedestrian 804 may include first to third sub areas 841 to 843. At this time, the respective sub areas for each moving object may indicate the distances by which the respective moving objects can move during different time periods.

For example, in the case of the other vehicle 801, the sub area 813 may indicate the predicted distance by which the other vehicle 801 will travel at 5 m/s during the next second, the sub area 812 may indicate the predicted distance that the other vehicle 801 can travel at 5 m/s during a one-second interval starting one second from the present, and the sub area 811 may indicate the predicted distance that the other vehicle 801 can travel at 5 m/s during the second after that. In another example, the sub area 823 for the motorcycle 802 may indicate the predicted distance that the motorcycle 802 can travel at 5 m/s during the next second. Even if the other vehicle 801, the motorcycle 802, the bicycle 803, and the pedestrian 804 have the same speed (e.g., 5 m/s), the controller 170 may set, based on the maximum speed and the maximum acceleration on a per-object-type basis stored in the memory 130, the longest dangerous area 910 for the other vehicle 801, the second longest dangerous area 920 for the motorcycle 802, the third longest dangerous area 930 for the bicycle 803, and the shortest dangerous area 940 for the pedestrian 804.
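The per-second sub-area distances described above reduce to simple arithmetic at a constant 5 m/s; a worked sketch using the 811-813 numbering, where 813 is the nearest interval:

```python
# Worked example: at a constant 5 m/s, each one-second interval adds 5 m, so
# sub area 813 spans 0-5 m, sub area 812 spans 5-10 m, and sub area 811 spans
# 10-15 m ahead of the object.
speed = 5.0  # m/s
intervals = [(i * speed, (i + 1) * speed) for i in range(3)]
for sub_area, (start, end) in zip((813, 812, 811), intervals):
    print(f"sub area {sub_area}: {start:.0f} m to {end:.0f} m")
```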

Even if the other vehicle 801, the motorcycle 802, the bicycle 803, and the pedestrian 804 have the same speed (e.g., 5 m/s) and the same movement direction (e.g., along the X-axis), the controller 170 may set, based on the maximum speed, the maximum acceleration and the minimum turning radius on a per-object-type basis stored in the memory 130, the widest dangerous area 1010 for the other vehicle 801, the second widest dangerous area 1020 for the motorcycle 802, the third widest dangerous area 1030 for the bicycle 803, and the narrowest dangerous area 1040 for the pedestrian 804. At this time, the respective sub areas for each moving object may indicate ranges within which the moving object can move during different time periods.

For example, in the case of the other vehicle 801, the sub area 1013 may indicate the predicted range within which the other vehicle 801 will travel at 5 m/s during the next second, the sub area 1012 may indicate the predicted range within which the other vehicle 801 will travel at 5 m/s during a one-second interval starting one second from the present, and the sub area 1011 may indicate the predicted range within which the other vehicle 801 will travel at 5 m/s during the second after that. FIG. 12A illustrates the case where the moving object is the bicycle 803. The controller 170 may detect and track a plurality of points P1, P2 and P3 on the bicycle 803 and a rider 1201, and may calculate the variation over time of at least one of the position, speed and movement direction of each of the points P1, P2 and P3.

For example, the point P1 may be one point on the head of the rider 1201 of the bicycle 803, the point P2 may be one point on the handle of the bicycle 803, and the point P3 may be one point on the wheel of the bicycle 803. As illustrated, the controller 170 may acquire motion vectors V1, V2 and V3 for the respective points P1, P2 and P3 based on the position, speed and movement direction of the points P1, P2 and P3. The motion vector V1 represents the position, movement direction and speed of point P1, the motion vector V2 represents the position, movement direction and speed of point P2, and the motion vector V3 represents the position, movement direction and speed of point P3. The controller 170 may set a dangerous area 1200 including a plurality of sub areas 1201, 1202 and 1203 based on the motion characteristics of the bicycle 803 as well as the acquired motion vectors V1, V2 and V3.
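One plausible way to aggregate the tracked points into a single predicted motion is to average the per-point motion vectors; the averaging rule below is an illustrative assumption (using NumPy for the vector arithmetic), not the document's stated method.

```python
# Sketch: average the motion vectors of the tracked points (head, handle,
# wheel) to predict the bicycle's velocity over the next second. Vectors are
# 2D (x, y) in m/s; the numbers are invented for illustration.
import numpy as np

v1 = np.array([0.0, 4.8])   # point P1: rider's head
v2 = np.array([0.5, 5.1])   # point P2: handle
v3 = np.array([0.2, 5.0])   # point P3: wheel

v_object = (v1 + v2 + v3) / 3.0       # predicted object velocity
speed = float(np.linalg.norm(v_object))
heading = float(np.degrees(np.arctan2(v_object[1], v_object[0])))
print(f"predicted speed {speed:.2f} m/s, heading {heading:.1f} degrees")
```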

FIG. 12B illustrates a dangerous area 1210 in the case where the motion vectors V1, V2 and V3 of the points P1, P2 and P3 illustrated in FIG. 12A are changed to motion vectors V1', V2' and V3'.

The controller 170 may change the size and shape of the dangerous area 1200 into those of the dangerous area 1210 based on the difference between the motion vectors V1, V2 and V3 and the motion vectors V1', V2' and V3'. Specifically, the sub area 1201 may be changed into a sub area 1211, the sub area 1202 may be changed into a sub area 1212, and the sub area 1203 may be changed into a sub area 1213. When the navigation display 141a is a touchscreen, the user of the vehicle surround monitoring device 100 may select the type of moving object for which the user wishes to receive guidance by touching at least one of the selection menu items 1311, 1312, 1313 and 1314. For example, when the user touches the selection menu item 1311, the vehicle surround monitoring device 100 may set dangerous areas only for external objects that correspond to other vehicles, excluding motorcycles, bicycles, and pedestrians. The vehicle surround monitoring device 100 may display the image 1410 indicating a dangerous area for the other vehicle 1401 and the image 1420 indicating the predicted route of the vehicle surround monitoring device 100 on the windshield of the vehicle surround monitoring device 100 via the head-up display 141b or the transparent display 141c. Specifically, as exemplarily illustrated in FIG. 14B, the driver of the vehicle surround monitoring device 100 may visually check the other vehicle 1401 through the windshield.

The other vehicle 1401 is moving from the right side to the left side of the intersection, and the image 1410 indicating the dangerous area may be displayed as extending leftward along the Y-axis from the front side of the other vehicle 1401, which is seen through the windshield. Specifically, the controller 170 may display an indicator 1431, indicating the location of the vehicle surround monitoring device 100, on a map 1430 stored in the memory 130 in a top-view mode. In addition, the controller 170 may perform mapping of an image 1432 indicating the predicted route of the vehicle surround monitoring device 100 on the map 1430.

For example, as the image indicating the predicted route of the vehicle surround monitoring device 100, the indicator 1432 may be included in the map 1430. At this time, since the vehicle surround monitoring device 100 is moving forward along the X-axis, the indicator 1432 may be displayed in front of the indicator 1431. The controller 170 may calculate congestion based on the total number of moving objects detected by the sensing unit 160.

For example, the controller 170 may calculate a first congestion value when one moving object is detected at a specific point in time, and may calculate a second congestion value, which is greater than the first value, when two or more moving objects are detected at a specific point in time. In addition, the controller 170 may adjust the scale of the map such that the number of moving objects to be displayed on the map corresponds to the calculated congestion value. Referring to FIG. 15B, the map 1540 may include an indicator 1541 indicating the location of the vehicle surround monitoring device 100. In addition, the controller 170 may map an image, indicating the predicted route of the vehicle surround monitoring device 100, to the map 1540. For example, as the image indicating the predicted route of the vehicle surround monitoring device 100, an indicator 1542 may be included in the map 1540. At this time, since the vehicle surround monitoring device 100 is moving forward along the X-axis, the indicator 1542 may be displayed in front of the indicator 1541.

Meanwhile, although not illustrated, the vehicle surround monitoring device 100 may display, on the map, only indicators for moving objects corresponding to types selected via the user interface 1310 illustrated in FIG. 13. For example, when only the selection menu item 1313 for designating bicycles is selected from the user interface 1310, the controller 170 may not display the indicator 1545 indicating the location of the pedestrian 1512 or the indicator 1546 indicating a dangerous area for the pedestrian 1512 on the map 1540. For example, when the predicted route of the vehicle surround monitoring device 100 overlaps a first sub area among the three sub areas, the controller 170 may select a warning output function corresponding to the first sub area. When the warning output function is selected, the controller 170 may output auditory feedback (e.g., a warning sound) via the sound output unit 142 or visual feedback (e.g., an image indicating danger) via the display unit 141 to the driver of the vehicle surround monitoring device 100. In a further example, when the predicted route of the vehicle surround monitoring device 100 overlaps a third sub area among the three sub areas, the controller 170 may select a warning output function, an emergency braking function, and an emergency steering function corresponding to the third sub area from the data table 1610. When the emergency steering function is selected, the controller 170 may change the direction of travel of the vehicle surround monitoring device 100 via the steering drive unit 152 so as to reduce the risk of a collision with a moving object. For example, because the first other vehicle, indicated by the indicator 1720, is moving away from the vehicle surround monitoring device 100, the controller 170 may determine that the risk of a collision between the first other vehicle indicated by the indicator 1720 and the vehicle surround monitoring device 100 is zero or very low, and thus not display any dangerous area for the first other vehicle indicated by the indicator 1720 on a map 1700.

On the other hand, because the distance between the second other vehicle indicated by the indicator 1730 and the vehicle surround monitoring device 100 is decreasing, the controller 170 may determine that there is a potential risk of a collision between the second other vehicle indicated by the indicator 1730 and the vehicle surround monitoring device 100, and may thus display a dangerous area 1740 for the second other vehicle indicated by the indicator 1730 on the map 1700. Specifically, the dangerous area 1740 may include a plurality of sub areas 1741, 1742 and 1743.

The size and shape of the respective sub areas 1741, 1742 and 1743 may be determined based on the speed and movement direction of the other vehicle indicated by the indicator 1730. For example, since the second other vehicle indicated by the indicator 1730 is moving forward, all of the sub areas 1741, 1742 and 1743 may be displayed in front of the indicator 1730.

In addition, each of the sub areas 1741, 1742 and 1743 may have a length corresponding to the speed of the second other vehicle indicated by the indicator 1730. Meanwhile, the vehicle surround monitoring device 100 may display the predicted route 1711 of the vehicle surround monitoring device 100 on the map 1700. The vehicle surround monitoring device 100 may determine the length and shape of an icon 1711, indicating the predicted route to be displayed on the map 1700, based on the speed and the movement direction of the vehicle surround monitoring device 100 and the route to a destination.

For example, when the vehicle surround monitoring device 100 is moving forward, the icon 1711 indicating the predicted route of the vehicle surround monitoring device 100 may be represented by an arrow corresponding to the forward movement direction, as illustrated in FIG. 17. As the moving object and the vehicle surround monitoring device 100 move along the road, the motion characteristics of the moving object and the motion characteristics of the vehicle surround monitoring device 100 may change.

The vehicle surround monitoring device 100 may change the map to be displayed on the display unit 141, either in real time or periodically, based on variation in at least one of the motion characteristics of the moving object and the motion characteristics of the vehicle surround monitoring device 100. In addition, as the vehicle surround monitoring device 100 comes closer to the intersection, the icon 1711, indicating the predicted route of the vehicle surround monitoring device 100, may change from the straight shape illustrated in FIG. 17 into the leftward bent shape illustrated in FIG. 18. That is, the icon 1711 illustrated in FIG. 18 may indicate that the vehicle surround monitoring device 100 is going to turn left. At this time, the controller 170 may determine the length of the icon 1711 indicating the predicted route of the vehicle surround monitoring device 100 based on the speed and movement direction of the vehicle surround monitoring device 100 and the route to a destination. The controller 170 may access the memory 130, and may select a function corresponding to the sub area 1741, which overlaps the predicted route of the vehicle surround monitoring device 100, from the data table 1610 illustrated in FIG. 16.

Specifically, as exemplarily illustrated in FIG. 18, when the predicted route of the vehicle surround monitoring device 100 overlaps the first sub area 1741 of the second other vehicle, the controller 170 may select and execute a warning output function with reference to the data table 1610. As the warning output function is executed, the vehicle surround monitoring device 100 may provide the driver with visual feedback which indicates the risk of a collision between the second other vehicle and the vehicle surround monitoring device 100.

At this time, the controller 170 may predict the time remaining until the collision between the vehicle surround monitoring device 100 and the second other vehicle based on, for example, the motion characteristics of the vehicle surround monitoring device 100, the motion characteristics of the second other vehicle, and the actual distance between the first sub area 1741 and the vehicle surround monitoring device 100, and then provide the driver with visual feedback that includes the predicted time. In one example, a message 1810, which suggests the action that is required of the driver and the type of collision that is expected (e.g. 'A head-on collision with the other vehicle is expected in 3 seconds. Reduce your speed'), may be displayed in one region of the map 1800. In addition, a warning icon 1820 may be displayed in the vicinity of the indicator 1730 indicating the second other vehicle.
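The predicted time in such a message could, for instance, come from a constant-speed closing model like the sketch below; the head-on geometry, in which the closing speed is the sum of the two speeds, is an assumption for illustration.

    def time_to_collision(distance_m, ego_speed_mps, object_speed_mps):
        """Remaining time until collision, assuming both vehicles keep a
        constant speed along the same line (head-on closing geometry)."""
        closing_speed = ego_speed_mps + object_speed_mps
        if closing_speed <= 0.0:
            return float("inf")  # not closing; no collision predicted
        return distance_m / closing_speed

    # Example: 45 m apart, ego at 10 m/s, oncoming vehicle at 5 m/s -> 3.0 s,
    # matching a message like 'A head-on collision ... is expected in 3 seconds.'
    print(time_to_collision(45.0, 10.0, 5.0))  # 3.0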

The controller 170 may select a function corresponding to the sub area 1742, which overlaps the predicted route of the vehicle surround monitoring device 100, from the data table 1610 illustrated in FIG. 16. Specifically, as illustrated in FIG. 19, when the predicted route 1711 of the vehicle surround monitoring device 100 and the second sub area 1742 of the second other vehicle overlap each other, the controller 170 may select and execute a warning output function and an emergency braking function with reference to the data table 1610. The execution of the warning output function has been described above with reference to FIG. 18, and a detailed description thereof will be omitted.

As described above, the vehicle surround monitoring device 100 may adjust the length of the indicator 1711, indicating the predicted route of the vehicle surround monitoring device 100, based on the speed of the vehicle surround monitoring device 100. As the speed of the vehicle surround monitoring device 100 is reduced by the execution of the emergency braking function, the distance that the vehicle surround monitoring device 100 can travel in the same time is reduced. Therefore, the length of the indicator 1711 may be reduced in proportion to the reduction in the speed of the vehicle surround monitoring device 100.

The controller 170 may access the memory 130, and may select a function corresponding to the sub area 1743, which overlaps the predicted route of the vehicle surround monitoring device 100, from the data table 1610 illustrated in FIG. 16. Specifically, as illustrated in FIG. 21, when the predicted route 1711 of the vehicle surround monitoring device 100 overlaps the third sub area 1743 of the second other vehicle, the controller 170 may select and execute a warning output function, an emergency braking function, and an emergency steering function with reference to the data table 1610. The warning output function and the emergency braking function have been described above with reference to FIGs. 18 and 19, and a detailed description thereof will be omitted below.

When the emergency steering function is executed, the controller 170 may display a message 2110, indicating that the emergency steering function is being executed (e.g. 'Executing emergency braking function and emergency steering function'), in one region of the map 2100. Simultaneously, the controller 170 may calculate the steering angle of the vehicle surround monitoring device 100 that is required to avoid a collision between the vehicle surround monitoring device 100 and the second other vehicle, based on the motion characteristics of the vehicle surround monitoring device 100, the motion characteristics of the second other vehicle, and the distance between the vehicle surround monitoring device 100 and the third sub area 1743. The controller 170 may provide the calculated steering angle information to the steering drive unit 152, so as to change the movement direction of the vehicle surround monitoring device 100. Referring to FIG. 22, as the speed of the vehicle surround monitoring device 100 is reduced by the execution of the emergency braking function, the length of the indicator 1711, indicating the predicted route of the vehicle surround monitoring device 100, may be shorter than that illustrated in FIG. 21. In addition, as the movement direction of the vehicle surround monitoring device 100 is changed by the execution of the emergency steering function, the indicator 1711 may provide guidance for the road at the upper side of the intersection, rather than the road at the left side of the intersection.
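For illustration only, the required steering angle mentioned above might be approximated with simple geometry; the constant-heading, single-track model and the lateral-clearance parameter below are assumptions, not the patent's method.

    import math

    def avoidance_steering_angle(lateral_clearance_m, distance_to_area_m):
        """Angle (degrees) whose tangent moves the vehicle sideways by the
        required clearance over the distance remaining to the dangerous area."""
        if distance_to_area_m <= 0.0:
            return 90.0  # already at the area; steer as hard as allowed
        return math.degrees(math.atan2(lateral_clearance_m, distance_to_area_m))

    # Example: 2 m of clearance needed over 20 m of travel -> about 5.7 degrees.
    print(round(avoidance_steering_angle(2.0, 20.0), 1))  # 5.7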

In addition, the vehicle surround monitoring device 100 may display, in one region of the map 2200, a message 2210 indicating that the emergency braking function and the emergency steering function have been deactivated and that a new route has been set for the vehicle surround monitoring device 100 (e.g. 'Emergency braking function and emergency steering function have been released. A new route has been set').

For example, the indicator 1711 illustrated in FIG. 21 provides guidance indicating that the vehicle surround monitoring device 100 is going to turn left, whereas the indicator 1711 illustrated in FIG. 22 provides guidance indicating that the vehicle is going to go straight. That is, when the movement direction of the vehicle surround monitoring device 100 is rapidly changed by the execution of the emergency steering function, or when rapid variation in the movement direction of the vehicle surround monitoring device 100 is expected, the vehicle surround monitoring device 100 may automatically cancel a previously found route and provide guidance for a new route to the driver based on the calculated steering angle. Referring to FIG. 23, an indicator 2310 indicating the location of the vehicle surround monitoring device 100 and an indicator 2330 indicating the location of another vehicle may be displayed on a map 2300.

When the vehicle surround monitoring device 100 and the other vehicle are moving into an intersection from different directions, as illustrated, the vehicle surround monitoring device 100 may display an indicator 2320 indicating the predicted route of the vehicle surround monitoring device 100 in front of the indicator 2310, and may display an indicator 2340 indicating a dangerous area for the other vehicle in front of the indicator 2330. Meanwhile, when the indicator 2320, indicating the predicted route of the vehicle surround monitoring device 100, overlaps the indicator 2340, indicating the dangerous area for the other vehicle, the controller 170 may execute a function that corresponds to the portion of the dangerous area for the other vehicle which overlaps the predicted route of the vehicle surround monitoring device 100. As illustrated, when the indicator 2320 indicating the predicted route of the vehicle surround monitoring device 100 overlaps a second sub area 2342 for the other vehicle, the controller 170 may select and execute the emergency braking function with reference to the data table 1610. In addition, the controller 170 may display a message 2350 indicating the execution of the emergency braking function in one region of the map 2300. Meanwhile, when the emergency braking function is executed, the vehicle surround monitoring device 100 may determine the braking force to be applied to the vehicle surround monitoring device 100 based on the distance between the vehicle surround monitoring device 100 and the second sub area 2342. For example, as illustrated, when the distance between the vehicle surround monitoring device 100 and the second sub area 2342 is a first distance L1, the controller 170 may control the brake drive unit 153 so as to generate braking force corresponding to the first distance L1.

In addition, the controller 170 may display a message 2351, indicating that the braking force (e.g. 'LEVEL 2') corresponding to the first distance L1 is applied to the vehicle surround monitoring device 100, in one region of the map 2300. Meanwhile, the distance L2 between the vehicle surround monitoring device 100 and a second sub area 2442 may become shorter than the distance L1 illustrated in FIG. 23 as the vehicle surround monitoring device 100 moves forward. For example, as illustrated, when the distance between the vehicle surround monitoring device 100 and the second sub area 2442 is reduced from the first distance L1 to the second distance L2, the controller 170 may control the brake drive unit 153 so as to generate braking force corresponding to the second distance L2.

In this case, since the second distance L2 is shorter than the first distance L1, and the risk of a collision between the vehicle surround monitoring device 100 and the other vehicle is therefore relatively high, the braking force corresponding to the second distance L2 may be greater than the braking force corresponding to the first distance L1. That is, the braking force applied to the vehicle surround monitoring device 100 may increase as the distance between the vehicle surround monitoring device 100 and the second sub area 2442 decreases. According to FIGs. 23 and 24, the vehicle surround monitoring device 100 may adjust a control parameter related to at least one of the predetermined functions based on, for example, the distance between the vehicle surround monitoring device 100 and the dangerous area, even when the predicted route of the vehicle surround monitoring device 100 overlaps the same portion of the dangerous area for the moving object. Thereby, it is possible to more reliably reduce the risk of a collision between the vehicle surround monitoring device 100 and the moving object in the vicinity of the vehicle surround monitoring device 100. Meanwhile, although the description of FIGs. 23 and 24 is based on the emergency braking function, this is given merely by way of example, and a similar method may be applied to other functions.
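One way to realize this distance-dependent scaling, echoing the 'LEVEL 2' message of FIG. 23, is sketched below; the two distance thresholds and the three levels are assumptions for illustration.

    def braking_level(distance_m, near_m=10.0, far_m=30.0):
        """Stronger braking as the sub area gets closer: LEVEL 1 when far,
        LEVEL 2 at mid range, LEVEL 3 when near."""
        if distance_m > far_m:
            return 1
        if distance_m > near_m:
            return 2
        return 3

    print(braking_level(20.0))  # 2  (e.g. at the first distance L1)
    print(braking_level(8.0))   # 3  (e.g. at the shorter distance L2)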

For example, when the predicted route 2320 of the vehicle surround monitoring device 100 overlaps the first sub area 2341, the controller 170 may increase the magnitude of the auditory feedback (e.g. the sound volume) as the distance between the vehicle surround monitoring device 100 and the first sub area 2341 is reduced. In another example, when the predicted route 2320 of the vehicle surround monitoring device 100 overlaps the third sub area 2343, the controller 170 may increase the steering angle used to avoid a collision with the other vehicle as the distance between the vehicle surround monitoring device 100 and the third sub area 2343 is reduced.

First, referring to FIG. 25A, the head-up display 141b and the transparent display 141c may be mounted in the space inside the vehicle surround monitoring device 100.

The driver of the vehicle surround monitoring device 100 may visually check a bicycle 2510 and another vehicle 2520 through the windshield. For example, the bicycle 2510 may be moving forward on the sidewalk at the right side of the lane in which the vehicle surround monitoring device 100 is located, and the other vehicle 2520 may be moving forward in the lane at the left side of the vehicle surround monitoring device 100. The vehicle surround monitoring device 100 may detect the bicycle 2510 and the other vehicle 2520 using the sensing unit 160. In addition, the vehicle surround monitoring device 100 may set a dangerous area for only the bicycle 2510, which is a moving object corresponding to the selection menu item 1313, among the detected bicycle 2510 and the detected other vehicle 2520, and display an indicator 2511 indicating the set dangerous area on the windshield. For example, the controller 170 may display the indicator 2511 on any one of the head-up display 141b and the transparent display 141c, i.e. on the windshield.

In addition, as illustrated, the controller 170 may display an indicator 2530, indicating the predicted route of the vehicle surround monitoring device 100, along with the indicator 2511. Since the indicator 2530, indicating the predicted route of the vehicle surround monitoring device 100, and the indicator 2511, indicating the dangerous area for the bicycle 2510, are simultaneously displayed in an augmented-reality mode on the windshield, the driver of the vehicle surround monitoring device 100 may intuitively recognize the motion characteristics (e.g. location, speed, and movement direction) of the vehicle surround monitoring device 100 and the bicycle 2510, as well as the distance between the vehicle surround monitoring device 100 and the bicycle 2510. Meanwhile, since there is no overlapping portion between the indicator 2511 and the indicator 2530, the vehicle surround monitoring device 100 may not execute any of the predetermined functions to avoid a collision between the vehicle surround monitoring device 100 and the bicycle 2510.

In one embodiment, the controller 170 may track a plurality of points on the bicycle 2510, and may adjust the size and shape of the indicator 2511 based on variation in the position, speed, and direction of the tracked points. For example, when the bicycle 2510, which is moving forward on the sidewalk as illustrated in FIG. 25A, increases its speed in order to enter the lane in which the vehicle surround monitoring device 100 is located, as illustrated in FIG. 25B, the controller 170 may adjust the direction of the indicator 2511 so as to point toward the lane, and may simultaneously increase the length of the indicator 2511.
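A minimal sketch of this point-tracking update, under the assumption that the tracked points are 2-D map coordinates sampled dt seconds apart, might look as follows; the averaging and the one-second length scaling are illustrative choices.

    import math

    def indicator_from_tracks(prev_pts, curr_pts, dt_s):
        """Average the displacement of tracked points to get the object's
        velocity, then point the indicator along it with a speed-proportional
        length."""
        n = len(curr_pts)
        vx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / (n * dt_s)
        vy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / (n * dt_s)
        speed = math.hypot(vx, vy)
        heading_rad = math.atan2(vy, vx)
        return heading_rad, speed * 1.0  # length ~ distance covered in 1 s

    # Example: points shifting +2 m in x over 0.5 s -> heading 0 rad, length 4 m.
    print(indicator_from_tracks([(0, 0), (1, 0)], [(2, 0), (3, 0)], 0.5))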

In a further example, the communication unit 110 may receive information regarding moving objects collected by a Roadside Unit (RSU) 2603 by performing Vehicle-to-Infrastructure (V2I) communication with the RSU 2603. Here, the RSU is, for example, a base station installed at a specific location alongside the road. In addition, the RSU 2603 may receive, from the vehicle surround monitoring device 100, information regarding moving objects detected by the vehicle surround monitoring device 100, and may provide the received information to the other objects 2601 and 2602 in the vicinity of the vehicle surround monitoring device 100.

The vehicle surround monitoring device 100 may set or change a dangerous area for a moving object based on the information regarding the moving object received via Vehicle-to-Everything (V2X) communication. For example, the controller 170 may set the dangerous area for the moving object based on the type and motion characteristics of the moving object determined from a sensing signal from the sensing unit 160, together with the information regarding the moving object received via V2X communication.
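One plausible reading of this fusion policy, including the occluded-object fallback described in the next paragraph, is sketched below; the dictionary format and the "prefer on-board sensing" rule are assumptions made for illustration.

    from typing import Optional

    def object_info(sensed: Optional[dict], v2x: Optional[dict]) -> Optional[dict]:
        """Combine on-board detection with V2X data; if the object cannot be
        sensed directly, rely on the V2X information alone."""
        if sensed and v2x:
            merged = dict(v2x)
            merged.update(sensed)  # on-board measurements take precedence
            return merged
        return sensed or v2x  # occluded object: V2X only (or nothing known)

    # Example: a vehicle hidden behind a building is known only through V2X.
    print(object_info(None, {"type": "car", "speed_mps": 8.0}))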

In another example, when the moving object is hidden by, for example, a building, and thus it is impossible to detect the moving object using the sensing unit 160, the controller 170 may set a dangerous area for the corresponding moving object based only on the information regarding the moving object received via V2X communication.

All of the vehicle surround monitoring device 100, the first other vehicle 2701, and the second other vehicle 2702 are moving forward, and the vehicle surround monitoring device 100 may determine the speed and movement direction of the first other vehicle 2701 and the second other vehicle 2702 in the vicinity of the intersection 2700. For example, the vehicle surround monitoring device 100 may determine that the first other vehicle 2701 is approaching the intersection 2700 at a first speed and that the second other vehicle 2702 has almost passed through the intersection 2700 at a second speed.

Of the first other vehicle 2701 and the second other vehicle 2702, the vehicle surround monitoring device 100 may determine that there is no risk of a collision with the second other vehicle 2702, which has almost passed through the intersection 2700, thereby judging that there is only a risk of a collision between the vehicle surround monitoring device 100 and the first other vehicle 2701. The vehicle surround monitoring device 100 may then determine whether the outside illuminance at the point in time at which the first other vehicle 2701 is detected is equal to or greater than a predetermined threshold illuminance. For example, the illuminance outside the vehicle surround monitoring device 100 may be sensed by an illuminance sensor included in the sensing unit 160.

Here, information regarding the threshold illuminance may be prestored in the memory 130. The threshold illuminance may be set based on, for example, the average illuminance difference between day and night, or the average illuminance difference between the inside and the outside of a tunnel. The threshold illuminance may also be changed through user input.

On the other hand, upon judging that the illuminance outside the vehicle surround monitoring device 100 is below the predetermined threshold illuminance, the vehicle surround monitoring device 100 may emit light 2720 in the direction of the first other vehicle 2701, rather than outputting the horn sound 2710, or together with the output of the horn sound 2710. For example, the controller 170 may determine the target rotation angle of a motor (e.g. a step motor) provided at the headlights of the vehicle surround monitoring device 100 based on the location of the vehicle surround monitoring device 100, the location of the first other vehicle 2701, and the distance between the vehicle surround monitoring device 100 and the first other vehicle 2701; may then adjust the positions of the headlights so as to correspond to the determined target rotation angle; and may thereafter control the lamp drive unit 154 so as to emit beams of light toward the first other vehicle 2701. At this time, the controller 170 may control the lamp drive unit 154 so that the emitted light beams have a greater intensity as the distance between the vehicle surround monitoring device 100 and the first other vehicle 2701 is reduced.
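As a rough sketch of this illuminance-dependent behavior, the following fragment chooses between the horn and aimed headlights; the threshold value, the coordinate convention, and the intensity model are assumptions made purely for illustration.

    import math

    THRESHOLD_LUX = 50.0  # assumed stand-in for the threshold prestored in memory 130

    def alert_other_vehicle(illuminance_lux, ego_xy, other_xy):
        """Sound the horn when it is bright enough; below the threshold, also
        aim the headlights at the other vehicle (bearing from ego position)."""
        if illuminance_lux >= THRESHOLD_LUX:
            return {"horn": True, "light": None}
        dx, dy = other_xy[0] - ego_xy[0], other_xy[1] - ego_xy[1]
        bearing_deg = math.degrees(math.atan2(dy, dx))  # target rotation angle
        distance_m = math.hypot(dx, dy)
        intensity = min(1.0, 10.0 / max(distance_m, 1.0))  # brighter when closer
        return {"horn": True,
                "light": {"angle_deg": bearing_deg, "intensity": intensity}}

    # Example: at night (5 lux), aim toward a vehicle about 20 m ahead and to the right.
    print(alert_other_vehicle(5.0, (0.0, 0.0), (20.0, 5.0)))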

In addition, with at least one of the embodiments of the present invention, when a moving object is hidden by, for example, an obstacle, and thus the vehicle cannot directly detect the moving object, the vehicle may receive information related to the motion of the corresponding moving object via vehicle-to-everything (V2X) communication and set a dangerous area for the corresponding moving object based on the received information. In this way, even if the moving object is invisible to the driver of the vehicle or cannot be directly detected by the vehicle, the vehicle is capable of checking the risk of a collision with the corresponding moving object in advance. In addition, it should be readily understood that the invention is not limited to the embodiments described above and the accompanying drawings.

Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions, or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Accordingly, the invention is not to be seen as limited by the foregoing description of the embodiments and the accompanying drawings, and some or all of the embodiments may be selectively combined with one another to achieve various alterations.