US20090024956A1 - Information processing apparatus and control method thereof, and computer program - Google Patents
- Publication number
- US20090024956A1 (U.S. application Ser. No. 12/170,994)
- Authority
- US
- United States
- Prior art keywords
- window
- instruction
- display contents
- display
- border
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- the present invention relates to an information processing apparatus and control method thereof, and a computer program.
- an information processing system which can simultaneously execute a plurality of applications with user interfaces, can display a plurality of windows corresponding to the applications at the same time, and can control the respective windows to serve as independent user interfaces.
- the information processing system can display the plurality of windows by one of the following methods.
- a method of overlapping the windows at arbitrary locations according to the rule of a predetermined priority order upon displaying the respective windows is available (overlap method).
- a method of tiling the windows without overlapping each other upon displaying the respective windows is available (tiling window method).
- the overlap method is more effective.
- windows allow modification of their sizes and locations in the X and Y directions independently or simultaneously.
- the windows need to be moved or resized so that no window is completely covered as a result of window overlap.
- a display controller of an information processing apparatus displays a window to be prioritized or a window selected by the user for access in front of all other windows in each case.
- the whole area of the window displayed in front of all other windows is displayed, and partial areas of other windows are displayed based on their overlapping states.
- this operation includes switching the display to locate a desired window in front of all other windows, and downsizing or moving the windows located in front of the target window.
- a window is resized by dragging one border or corner of the window.
- the window is moved by dragging a specific region which is not used for resizing.
- upon resizing a window, a specification prepared in advance for each window type determines the display control performed during resizing. More specifically, one specification moves the display contents in response to dragging when a window is resized by dragging one border or corner; another does not move the display contents irrespective of dragging; yet another moves the display contents at a predetermined ratio with respect to the drag, or reduces or otherwise modifies them.
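As a rough illustration (not code from the patent), these three kinds of specification amount to a dispatch on a per-window-type setting. The spec labels 'follow', 'fixed', and 'ratio' below are assumed names for the three behaviors described above.

```python
def content_shift(drag_delta, spec, ratio=0.5):
    """Shift applied to the display contents for a given drag amount.

    Hypothetical labels for the three kinds of specification:
      'follow' - contents move with the dragged border or corner
      'fixed'  - contents do not move irrespective of dragging
      'ratio'  - contents move a predetermined ratio of the drag
    """
    if spec == 'follow':
        return drag_delta
    if spec == 'fixed':
        return 0
    if spec == 'ratio':
        return drag_delta * ratio
    raise ValueError('unknown specification: %s' % spec)
```

For example, dragging a border 40 pixels to the left (`drag_delta = -40`) shifts the contents by -40, 0, or -20 depending on the window type's specification.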
- FIG. 21 shows the configuration of a window to be displayed on a display device.
- FIG. 21 shows a window that displays a document.
- FIGS. 22A to 22D and FIGS. 23A to 23D are explanatory views of popular display control methods upon resizing a window.
- FIGS. 22A to 22D are views showing cases in which the window shown in FIG. 21 is resized by dragging one of the four borders.
- FIGS. 22A and 22C show the cases in which the window size is reduced by moving the right or bottom border. In these cases, the display contents near the border to be moved are gradually hidden.
- FIGS. 22B and 22D show the cases in which the window size is reduced by moving the left or top border. In these cases, the display contents near the right or bottom border opposite to the border to be moved are gradually hidden.
- FIGS. 23A to 23D show cases in which the window shown in FIG. 21 is resized by dragging the corners of the window. Note that the corners of the window mean the intersections of the respective borders that define the window.
- the concept of the display control shown in FIGS. 22A to 22D and FIGS. 23A to 23D is basically to preferentially display the left and upper portions of the display contents of a window.
- many windows which aim at the drawing function and display of general figures do not always preferentially display the left and upper portions, and different specifications are determined in advance for respective window types.
- Most windows have scroll bars to shift the position of the display contents.
- by operating the scroll bar, the user can bring the contents that the user wants to display or access to a desired position within the window.
- a certain window often forms parent and child windows defined by predetermined specifications, so as to prevent related windows from becoming hard to see due to overlapping display, or to prevent the correspondence between related windows from becoming confusing.
- embodiments of the present invention provide a technique that allows the user, while resizing a window, to arbitrarily and intuitively move a desired part of the display contents to a predetermined location concurrently with the resizing.
- an information processing apparatus comprising: a display unit configured to display a window; an accepting unit configured to accept a resize instruction for the displayed window together with a scroll instruction indicating whether or not to scroll the display contents within the window; and a control unit configured to control the size of the window and the scrolling of the display contents within the window based on the contents of the resize instruction and the scroll instruction, wherein when the scroll instruction indicates that the display contents are to be scrolled, the control unit changes the window to the size indicated by the resize instruction and scrolls the display contents according to the change amount of the window, and when the scroll instruction indicates that the display contents are not to be scrolled, the control unit changes the window to the size indicated by the resize instruction and suppresses scrolling of the display contents.
- a method of controlling an information processing apparatus, comprising: displaying a window on a display unit; accepting a resize instruction for the displayed window together with a scroll instruction indicating whether or not to scroll the display contents within the window; and controlling the size of the window and the scrolling of the display contents within the window based on the contents of the resize instruction and the scroll instruction, wherein when the scroll instruction indicates that the display contents are to be scrolled, the window is changed to the size indicated by the resize instruction and the display contents are scrolled according to the change amount of the window, and when the scroll instruction indicates that the display contents are not to be scrolled, the window is changed to the size indicated by the resize instruction and scrolling of the display contents is suppressed.
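As an illustrative reduction of the control recited above (not the patent's actual implementation), a one-dimensional resize under a scroll instruction might be sketched as follows; the dict keys and the function name are assumptions.

```python
def apply_resize(window, new_width, scroll):
    """Change the window to the size indicated by the resize
    instruction; scroll the contents by the change amount of the
    window only when the scroll instruction says to.

    window: dict with 'width' (window width) and 'offset'
    (X offset of the display contents relative to the window).
    """
    change = new_width - window['width']  # change amount of the window
    window['width'] = new_width
    if scroll:
        window['offset'] += change  # scroll according to the change amount
    # else: scrolling is suppressed; the offset is left unchanged
    return window
```

Shrinking a 400-pixel-wide window to 300 with scrolling ON shifts the contents by -100; with scrolling OFF the contents stay fixed relative to the window.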
- FIG. 1A is a block diagram showing an example of the hardware arrangement of an information processing apparatus according to an embodiment of the invention
- FIG. 1B shows an example of the arrangement of a mouse as an example of an operation unit 109 according to the embodiment of the invention
- FIG. 1C shows an example of the arrangement of a digital pen and tablet as an example of the operation unit 109 according to the embodiment of the invention
- FIG. 2 shows an example of the configuration of a window according to the embodiment of the invention
- FIGS. 3A to 3D show display examples when the user locates a cursor on a first or second region of a top border 201 or bottom border 202 of a window and drags it according to the first embodiment of the invention
- FIGS. 4A to 4D show display examples when the user locates the cursor on a first or second region of a left border 203 or right border 204 of a window and drags it according to the first embodiment of the invention
- FIG. 5A is a view for explaining ON/OFF switching of scrolling upon resizing according to the first embodiment of the invention
- FIG. 5B is a view for explaining ON/OFF switching of scrolling upon resizing when the cursor position is changed from the display state of FIG. 5A ;
- FIG. 6 is a flowchart showing an example of window resizing processing according to the first embodiment of the invention.
- FIG. 7A shows an example of a state in which the size of a window 200 matches that of a whole display screen 700 according to the second embodiment of the invention
- FIG. 7B shows an example of a state in which the size of the window 200 changes when the user locates a cursor 701 on a second region 203 b and drags it in the X direction according to the second embodiment of the invention
- FIG. 7C shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a first region 203 a and drags it in the X direction according to the second embodiment of the invention
- FIG. 8A shows an example of a state in which the size of the window 200 matches that of the whole display screen 700 according to the second embodiment of the invention
- FIG. 8B shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a second region 202 b and drags it in the Y direction according to the second embodiment of the invention
- FIG. 8C shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a first region 202 a and drags it in the Y direction according to the second embodiment of the invention
- FIG. 9A shows an example of a state before the beginning of dragging when the user locates a cursor P on a corner 209 (P 0 ) in a display control method according to the third embodiment of the invention
- FIG. 9B shows an example of a state in which the user moves the cursor P from the position P 0 on the corner 209 to a position P 1 in the display control method according to the third embodiment of the invention
- FIG. 9C shows an example of a state in which the user moves the cursor P from P 1 to P 2 in the display control method according to the third embodiment of the invention.
- FIG. 10 is a flowchart showing an example of window resizing processing according to the third embodiment of the present invention.
- FIG. 11 shows an example of a window including a plurality of sub-windows
- FIG. 12 is a view for explaining the fourth embodiment of the invention taking as an example a window which is divided into left and right sub-windows as first and second sub-windows;
- FIG. 13 shows an example of a change in display contents when the user moves a boundary in a window divided by one boundary according to the fourth embodiment of the invention
- FIG. 14 is a flowchart showing an example of window resizing processing according to the fourth embodiment of the invention.
- FIGS. 15A and 15B show division examples of boundaries
- FIG. 16 shows an example of display contents of a window according to the fifth embodiment of the invention.
- FIGS. 17A and 17B show display examples when the user resizes (reduces) the window by dragging one border of the window according to the fifth embodiment of the invention
- FIGS. 18A and 18B show display examples when the user resizes the window by dragging one corner of the window according to the fifth embodiment of the invention
- FIGS. 19A and 19B show display examples that allow a normally hidden part to be easier to see according to the fifth embodiment of the invention.
- FIG. 20 is a flowchart showing an example of window resizing processing corresponding to the display examples shown in FIGS. 17A and 17B ;
- FIG. 21 shows the configuration of a window displayed on a display device
- FIGS. 22A to 22D show cases in which the user resizes the window shown in FIG. 21 by dragging one of four borders;
- FIGS. 23A to 23D show cases in which the user resizes the window shown in FIG. 21 by dragging one of four corners of the window.
- the present invention provides a technique that allows the user to control arbitrarily, concurrently with dragging, whether or not to scroll the display contents of a window when the window is resized by dragging an element (border, corner, boundary, etc.) which configures the window.
- the present invention proposes the following three control techniques.
- the first control technique covers a case in which the user resizes a window by mainly dragging one border of the window.
- this technique is characterized in that the component of the cursor motion in the direction not directly related to resizing is used for control.
- two different regions are formed on each border of a window, and ON/OFF of scrolling can be controlled concurrently with dragging by selecting one of those regions while dragging.
- the second control technique is characterized in that ON/OFF of scrolling is controlled by operating a button other than that for dragging of an operation unit upon making a drag movement. This technique can be applied to both a case of dragging a corner and that of dragging a border.
- the third control technique executes control by combining the first and second control techniques.
- each sub-window is resized by dragging a boundary of the sub-window.
- this technique can also be applied to a window including many sub-windows.
- the first embodiment of the invention will be described hereinafter. This embodiment relates to the first control technique.
- FIG. 1A is a block diagram showing an example of the hardware arrangement of an information processing apparatus used to implement the present invention.
- a CPU 101 executes an OS, application programs, and the like stored in an HD (hard disk) 103 , and controls to temporarily store information, files, and the like required for execution of the programs in a RAM 102 .
- the RAM 102 serves as a main memory, work area, and the like of the CPU 101 .
- the HD 103 stores the application programs, driver programs, the OS, control programs, a processing program required to execute processing according to this embodiment, and the like.
- a display unit 104 displays information according to commands input from an operation unit 109 , externally acquired information, and the like.
- the display unit 104 may adopt any display method, such as CRT, liquid crystal, PDP, SED, or organic EL.
- the display unit 104 displays a window according to this embodiment.
- a network interface (to be referred to as “I/F” hereinafter) 105 is a communication interface used to connect to a network.
- a ROM 106 stores programs such as a basic I/O program and the like.
- An external storage drive 107 can load programs and the like stored in a medium 108 to this computer system.
- the medium 108 as a storage medium stores predetermined programs and related data.
- the operation unit 109 is a user interface used to accept operations and instructions from an operator of this apparatus, and comprises a keyboard, mouse, digital pen, and the like.
- a system bus 110 controls the flow of data in the apparatus.
- a mouse, digital pen, and tablet as examples of the operation unit 109 can have the arrangements shown in, for example, FIGS. 1B and 1C .
- the mouse and tablet are connected to an information processing apparatus 100 using USB connections, and can serve as the operation unit 109 .
- a mouse 120 shown in FIG. 1B can constitute a part of the operation unit 109 .
- the mouse 120 has the left button 121 and the right button 122 .
- the bottom surface of the mouse 120 comprises a structure for detecting the moving amount and direction of the mouse 120 , using either a mechanical mechanism with a ball or an optical mechanism with an optical sensor.
- a digital pen 130 and tablet 140 shown in FIG. 1C can constitute a part of the operation unit 109 .
- the digital pen 130 can comprise a tip switch 131 at the pen tip, and a side switch 132 on the side surface.
- the tip switch 131 corresponds to the left button 121 of the mouse 120
- the side switch 132 corresponds to the right button 122 of the mouse 120 .
- the tip switch 131 can be turned on by pressing it against the tablet 140 .
- the side switch 132 can be turned on when the operator holds it down with the finger.
- the tablet 140 comprises a pressure-sensitive or electrostatic contact sensor, and can detect the position of the digital pen 130 when the tip of the digital pen 130 is pressed against the tablet 140 .
- the tablet 140 can detect the moving direction and amount of the digital pen 130 .
- the tablet 140 may be integrated with the display unit 104 .
- FIG. 2 shows an example of the configuration of a window according to the embodiment of the invention.
- a window 200 has a rectangular shape, and is defined by four borders, that is, a top border 201 , bottom border 202 , left border 203 , and right border 204 .
- the window 200 has four corners 207 , 208 , 209 , and 210 .
- the corner 207 is defined as an intersection between the top border 201 and left border 203
- the corner 208 is defined as an intersection between the left border 203 and bottom border 202
- the corner 209 is defined as an intersection between the bottom border 202 and right border 204
- the corner 210 is defined as an intersection between the right border 204 and top border 201 .
- each border is divided into two different regions, that is, first and second regions. More specifically, the first region is located to include the center of the border, and the second regions are located to include the end portions of the border and to sandwich the first region. For example, on the top border 201 , a first region 201 a including the center of the border is located to be sandwiched between second regions 201 b including the end portions of the border.
- each border may be divided into three equal parts, or the first region may be slightly longer or shorter than one third of the border length. This embodiment will exemplify a case in which each border is divided into three equal regions.
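For illustration only, classifying the cursor position along a border into the first (center) region and the second (end) regions under the equal-thirds split exemplified above could look like this; the function name and region labels are assumptions.

```python
def classify_region(border_start, border_length, cursor_pos):
    """Return 'first' for the middle third of the border (the region
    that enables scrolling while resizing) and 'second' for either
    end third (which suppresses it), per the equal-thirds example."""
    t = (cursor_pos - border_start) / float(border_length)
    if 1.0 / 3.0 <= t <= 2.0 / 3.0:
        return 'first'
    return 'second'
```

On a 300-pixel border starting at 0, positions near the middle (e.g. 150) fall in the first region, while positions near either end (e.g. 50 or 280) fall in a second region.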
- the window 200 includes a title bar 205 and display area 206 .
- the title bar 205 displays information corresponding to the content displayed in the display area 206 .
- the title bar 205 displays a document name.
- the display area 206 displays the contents of data to be displayed.
- the display area 206 displays the contents of a document for a document file, or displays a corresponding image or graphic information for an image or graphic file.
- the window can be resized by dragging one of the four borders of the window based on the operation of the operation unit 109 , and moving the selected border in a direction perpendicular to that border. That is, in this embodiment, the drag operation corresponds to a window resize instruction operation.
- the window resize instruction, including a scroll instruction indicating whether or not to scroll the display contents within the window, is accepted.
- this embodiment uses “drag” as a term that represents the concept to be described below.
- the mouse shown in FIG. 1B is used as the operation unit 109 with the default settings of Microsoft Windows®.
- the display position of a cursor displayed on the screen of the display unit 104 is controlled in response to the movement of the mouse 120 .
- when the user presses the left button 121 while the cursor is located on a target to be selected, that target is selected and highlighted.
- moving the cursor by moving the mouse 120 in this state will be referred to as “dragging”.
- FIGS. 3A to 3D show display examples according to this embodiment when the user drags the cursor while locating it on the first or second region of the top border 201 or bottom border 202 of the window in the display state of FIG. 2 .
- FIGS. 4A to 4D show display examples according to this embodiment when the user drags the cursor while locating it on the first or second region of the left border 203 or right border 204 of the window in the display state of FIG. 2 .
- in FIGS. 3A and 3C and FIGS. 4A and 4C , it can also be considered as if the display contents were moving in correspondence with the movement of the border.
- in this embodiment, such a change in display contents will be referred to as “resizing with scrolling”.
- a state in which the display contents of the display area 206 are moved and displayed in correspondence with the movement of the border will be referred to as “with scrolling”, “the display contents are scrolled”, or “scrolling the display contents”.
- the display contents near a border (first border) where the cursor is located are changed. More specifically, the display contents are changed so as to be hidden in turn by the first border. On the other hand, the display contents near a border (second border) opposite to the border (first border) where the cursor is located remain unchanged.
- in FIGS. 3B and 3D and FIGS. 4B and 4D , it can also be considered as if the display contents were fixed with respect to the movement of the border. In this embodiment, such a change in display contents will be referred to as “resizing without scrolling”.
- a state in which the display contents on the display area 206 are fixedly displayed with respect to the whole display screen will be referred to as “without scrolling”, “the display contents are not scrolled”, or “not scrolling the display contents”.
- “resizing with scrolling” and “resizing without scrolling” can be executed during resizing in a continuous drag operation. That is, the resizing with scrolling and that without scrolling can be switched in real time during a continuous, single drag operation. Hence, the user can resize the window while adjusting the display position.
- FIGS. 5A and 5B are views for explaining this switching according to this embodiment.
- the width and height directions of the window 200 respectively match the X and Y directions of an X-Y coordinate system 502 set on the display screen where the window 200 is displayed.
- P 0 represents an initial position of the cursor.
- FIG. 5B expresses a state in which a position P(Px, Py) of the cursor is continuously changed like P 0 → P 1 → P 2 → P 3 or P 5 → P 6 → P 7 → P 8 during a single drag operation.
- P(Px, Py) is a coordinate value based on the X-Y coordinate system 502 set on the display screen.
- this process can be expressed in association with the point Q by: ΔQx=0
- ΔQx is a difference between Qx at the beginning of the resizing without scrolling, and Qx after the window is resized.
- this process can be expressed in association with the position Q by: ΔQx=ΔCx=ΔPx
- ΔQx is a difference between Qx at the beginning of the resizing with scrolling, and Qx after the window is resized.
- ΔCx and ΔPx are differences between Cx and Px at the beginning of the resizing with scrolling, and Cx and Px after the window is resized. Note that these differences correspond to change amounts of the window 200 in the X direction.
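The two relations (ΔQx = 0 without scrolling; ΔQx = ΔCx = ΔPx with scrolling) reduce to one line of arithmetic. A minimal sketch, with an assumed function name:

```python
def delta_qx(delta_px, with_scrolling):
    """Moving amount of the content point Q in the X direction.

    delta_px is the change in the cursor's X position since resizing
    began (equal to the change amount delta_cx of the dragged border).
    With scrolling:    delta_qx = delta_cx = delta_px.
    Without scrolling: delta_qx = 0.
    """
    return delta_px if with_scrolling else 0
```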
- in this manner, “resizing with scrolling” and “resizing without scrolling” are switched concurrently with dragging according to a change in the position of the cursor in the Y direction.
- FIG. 6 is a flowchart showing an example of the window resizing processing according to the first embodiment.
- the processing corresponding to the flowchart shown in FIG. 6 is implemented when the CPU 101 reads out a corresponding processing program stored in the HD 103 onto the RAM 102 and executes that program to control respective components.
- FIG. 6 describes a case wherein the user resizes the window by dragging the left border 203 of the window 200 .
- the embodiment of the invention is not limited to the case wherein the left border 203 is dragged. That is, the same processing as in FIG. 6 can resize the window by dragging the top border 201 , bottom border 202 , and right border 204 .
- in step S 601 , the CPU 101 acquires operation information (information of a first instruction operation) of a first button of the mouse 120 or digital pen 130 of the operation unit 109 , and information (moving information) of the moving direction and amount of the mouse 120 or digital pen 130 .
- the first button (first operation unit) corresponds to the left button 121 of the mouse 120 if the mouse 120 is used in the default settings of Microsoft Windows®.
- the first button corresponds to the tip switch 131 at the pen tip of the digital pen 130 .
- the CPU 101 determines in step S 602 based on the operation information of the first button acquired in step S 601 whether or not the first button is switched from OFF to ON. If it is determined that the first button is switched to ON (“YES” in step S 602 ), the process advances to step S 603 . On the other hand, if it is determined that the first button is kept OFF without being switched to ON (“NO” in step S 602 ), the process returns to step S 601 to continue the processing.
- in step S 603 , the CPU 101 calculates the position coordinate of the cursor (cursor position coordinate) based on the moving amount information acquired in step S 601 to determine on which border of the window 200 the cursor is located. This determination can be made by checking which of the predetermined regions, set based on the first and second regions of the borders that configure the window 200 , includes the cursor position coordinate.
- if it is determined in step S 603 that the cursor is located on the left border 203 of the window 200 (“left border” in step S 603 ), it can be determined that the user begins to drag the left border 203 . In this case, the process advances to step S 604 . On the other hand, if the cursor is located on one of the remaining borders (the top border 201 , bottom border 202 , or right border 204 ) (“another border” in step S 603 ), it can be determined that the user begins to drag another border. In this case, the process advances to step S 605 . In step S 605 , the CPU 101 executes window resizing processing by dragging of another border.
- in step S 604 , the CPU 101 determines the cursor position coordinate P(Px, Py) at the beginning of dragging, as shown in FIG. 5A , for the window which begins to be dragged. Also, the CPU 101 determines the position C(Cx, Cy) of the corner 208 at the lower end of the left border 203 and the position Q(Qx, Qy) of the arbitrary display contents, as shown in FIG. 5B .
- in step S 606 , the CPU 101 further acquires the operation information of the first button and the moving amount information, and updates the cursor position coordinate P(Px, Py) based on the moving amount information.
- the CPU 101 determines in step S 607 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S 607 ), this processing ends. In this case, a so-called “drop” operation is made.
- in step S 608 , the CPU 101 sets the X position (Cx) of the left border 203 of the window 200 to match the X component (Px) of the cursor position coordinate updated in step S 606 . In this way, the position of the left border 203 follows the movement of the cursor in the X direction.
- the CPU 101 determines in step S 609 based on the cursor position coordinate updated in step S 606 whether or not the cursor is located on the first region. If it is determined that the cursor is located on the first region (“YES” in step S 609 ), the process advances to step S 610 . On the other hand, if it is determined that the cursor is located on the second region (“NO” in step S 609 ), the process advances to step S 611 .
- in step S 610 , the CPU 101 sets the moving amount ΔQx of the position Q of the arbitrary display contents in the X direction to be equal to the moving amount ΔPx of the cursor in the X direction, so as to scroll the display contents upon resizing the window.
- in step S 611 , the CPU 101 sets the moving amount ΔQx to zero so as to suppress scrolling of the display contents upon resizing the window.
- in step S 612 , the CPU 101 updates display of the cursor and window 200 based on the position of the left border 203 determined in step S 608 and the moving amount ΔQx determined in step S 610 or S 611 . After that, the process returns to step S 606 to continue the processing.
- the loop returning to step S 606 represents cursor movement during dragging, that is, dragging is continued and resizing of the window is in progress during this loop.
- this represents that the drop operation is made to settle the window size.
- each border of the window is divided into two different regions, and the change method of the display contents within the window can be controlled based on the selected region. Since the region can be selected in real time during resizing of the window, the position of the display contents within the window can be controlled simultaneously with resizing. In this way, a desired display result can be obtained by a series of operations, thus improving the work efficiency.
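Putting steps S 606 to S 612 together, the drag loop for the left border might be sketched as below. This is an illustrative reduction of FIG. 6, not the patent's code; the dict keys and the shape of the input samples are assumptions.

```python
def drag_left_border(window, samples):
    """Sketch of the FIG. 6 loop for dragging the left border 203.

    window: dict with 'cx' (X position of the left border) and 'qx'
    (X position of an arbitrary content point Q).
    samples: iterable of (px, button_on, on_first_region) tuples
    standing in for the information acquired in steps S 601 /S 606 .
    """
    prev_px = None
    for px, button_on, on_first_region in samples:
        if not button_on:              # S 607 "NO": drop settles the size
            break
        if prev_px is not None:
            dpx = px - prev_px
            window['cx'] += dpx        # S 608 : border follows the cursor
            if on_first_region:        # S 609 /S 610 : scroll the contents
                window['qx'] += dpx
            # S 611 : on a second region, delta Qx = 0 (no scrolling)
        prev_px = px                   # S 612 : display would be redrawn here
    return window
```

In the example below the user drags 20 pixels left on the first region (border and contents both move), then 20 more pixels left on a second region (only the border moves), then drops.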
- the display states 1 and 3 will be compared. In the case of display state 1 , since the window itself is fixed in size, there is no trouble in handling the window. However, in order to refer to another window, a switching operation for canceling the full screen display state is required.
- window display of the first embodiment is applied to so-called “full screen display”.
- a window is maximized in the X and Y directions of the display screen of the display unit 104 , and is fixed in size.
- the window cannot be resized unless the full screen display state is canceled.
- a window is maximized in only one of the X and Y directions within the display screen, and is fixed in size in that direction.
- one border is fixed to the end of the display screen, and only the other border is movable by dragging. By operating this border that can be dragged, the window can be resized in one direction.
- FIGS. 7A to 7C show examples of full screen display according to this embodiment.
- reference numeral 700 denotes a whole display screen of the display unit 104 . Since the window configuration is the same as that in FIG. 2 of the first embodiment, corresponding reference numerals will be used.
- a left border 203 of a window 200 includes first region 203 a and second regions 203 b . The user can drag the first and second regions 203 a and 203 b using a cursor 701 .
- the directions of the whole display screen 700 and window 200 are determined based on an X-Y coordinate system 502 .
- FIG. 7A shows a state in which the size of the window 200 matches that of the whole display screen 700 . That is, FIG. 7A corresponds to the full screen display state.
- FIG. 7B shows a state in which the window 200 is resized when the user locates the cursor 701 on the second region 203 b and drags it in the X direction.
- the size of the window 200 changes in only the X direction.
- a right border 204 opposite to the dragged left border 203 is fixed to the end of the display area, and only the left border 203 can be dragged.
- the window is resized in one direction.
- the window 200 is fixed in a maximum size in the Y direction perpendicular to the dragging direction. Note that in case of FIG. 7B , since the second region 203 b is used, resizing without scrolling described in the first embodiment is executed.
- FIG. 7C shows a state in which the window is resized when the user locates the cursor 701 on the first region 203 a and drags it in the X direction.
- the size of the window 200 changes in only the X direction.
- the right border 204 opposite to the dragged left border 203 is fixed to the end of the display area, and only the left border 203 can be dragged.
- the window 200 is resized in one direction. Note that the window 200 is fixed in a maximum size in the Y direction perpendicular to the dragging direction. Note that resizing with scrolling described in the first embodiment is executed since the first region 203 a is used at this time.
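The one-direction resize of FIGS. 7A to 7C can be sketched as follows. This is an illustrative sketch only, not the patent's actual implementation: the names (`Window`, `resize_left_border`) and the assumed screen size are hypothetical. The right border stays fixed at the screen edge, the dragged left border sets the new width, the height stays maximized, and the region used for the drag decides whether the contents scroll.

```python
# Sketch of the one-direction resize in FIGS. 7A-7C: the right border is
# fixed at the screen edge, only the left border moves, and the height is
# always maximized. Names and screen size are assumptions.
from dataclasses import dataclass

SCREEN_W, SCREEN_H = 1280, 800  # assumed display size

@dataclass
class Window:
    left: int       # X of the movable left border
    content_x: int  # X offset of the display contents (scroll position)
    # right border fixed at SCREEN_W; height fixed at SCREEN_H

def resize_left_border(win: Window, new_left: int, with_scroll: bool) -> None:
    """Drag the left border to new_left. Dragging the first region
    (with_scroll=True) scrolls the contents by the same amount; dragging
    the second region (with_scroll=False) leaves them in place."""
    delta = new_left - win.left
    win.left = max(0, min(new_left, SCREEN_W))
    if with_scroll:
        win.content_x += delta  # contents follow the dragged border

win = Window(left=0, content_x=0)                # full screen state (FIG. 7A)
resize_left_border(win, 300, with_scroll=False)  # second region: no scroll
assert (win.left, win.content_x) == (300, 0)
resize_left_border(win, 200, with_scroll=True)   # first region: with scroll
assert (win.left, win.content_x) == (200, -100)
```

The same sketch applies to the bottom border of FIGS. 8A to 8C with X and Y exchanged.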
- in FIGS. 7A to 7C , the left border 203 is used as a border having a function of resizing the window.
- any of the remaining three borders which configure the window 200 may be used as a border having a function of resizing the window.
- FIGS. 8A to 8C show a case using a bottom border 202 . That is, FIG. 8A shows an example of a state in which the size of the window 200 according to this embodiment matches that of the whole display screen 700 .
- FIG. 8B shows an example of a state in which the window 200 is resized when the user locates the cursor 701 on a second region 202 b and drags it in the Y direction according to this embodiment.
- FIG. 8C shows an example of a state in which the window is resized when the user locates the cursor 701 on a first region 202 a and drags it in the Y direction according to this embodiment.
- a border that is movable can also be referred to as a “movable border”
- a border located at a position opposite to the movable border can also be referred to as a “first fixed border (opposing fixed border)”
- the remaining two borders can also be referred to as a “second fixed border” and “third fixed border”.
- the left border 203 corresponds to the movable border
- the right border 204 corresponds to the first fixed border (opposing fixed border)
- a top border 201 and the bottom border 202 respectively correspond to the second and third fixed borders.
- the bottom border 202 corresponds to the movable border
- the top border 201 corresponds to the first fixed border (opposing fixed border)
- the left and right borders 203 and 204 respectively correspond to the second and third fixed borders.
- the display position on a display area 206 of the window 200 can be controlled in the same manner as in the first embodiment. However, the only difference is that the first and second regions, which were given to all four borders in the first embodiment, are limited to a single border in this embodiment.
- the window according to this embodiment is maintained in a maximized state in one of the X and Y directions (width and height directions). Therefore, upon reordering a plurality of windows, only a one-dimensional positional relationship need be considered. As a result, compared to reordering windows in consideration of a two-dimensional positional relationship, the operation is simplified considerably, greatly reducing complexity.
- since the window can be resized, a window hidden below the upper window can be revealed, unlike the case in which a window is completely maximized in both the X and Y directions, thus improving convenience.
- such a window can be defined as a fourth window display state in addition to the aforementioned window display states 1 to 3 .
- the point of this embodiment is not merely that the window can be resized in one direction in the full screen display state; rather, it lies in that the display position of the display contents within the window can be controlled at the time of the drag operation, in combination with the invention according to the first embodiment.
- the aforementioned first embodiment has proposed the display control method upon resizing the window by dragging one of the borders which configure the window.
- This method is effective in the case in which the window is often resized by mainly dragging the border.
- this method is very effective for the window which is maximized in only one direction, as described in the second embodiment.
- a normal window can be resized by dragging one of its corners, as shown in FIGS. 23A to 23D . Whether a user drags the border or the corner to resize such a normal window depends on the preference of the user, the display contents of individual applications, individual work contents, and the like.
- This embodiment proposes a method that can control ON/OFF of scrolling during resizing in real time as in the first embodiment even upon resizing a window by dragging its corner.
- this embodiment uses the ON/OFF state of a second button of a mouse 120 or digital pen 130 of an operation unit 109 to switch between resizing with scrolling and resizing without scrolling upon resizing a window.
- the second button (second operation unit) corresponds to a right button 122 of the mouse 120 in the default settings of Microsoft Windows®.
- the second button corresponds to a side switch 132 on the side surface of the digital pen 130 .
- the second button may be assigned to a specific key such as a control key.
- FIG. 9A shows a state before the beginning of dragging, in which the user locates a cursor P on a corner 209 (P 0 ).
- FIG. 9B shows a state in which the user moves the cursor P from the position P 0 to a position P 1 of the corner 209 .
- the user turns on the second button to execute the resizing with scrolling.
- FIG. 9C shows a state in which the user moves the cursor P from P 1 to P 2 . Upon this cursor movement, the user turns off the second button to execute the resizing without scrolling.
- a dotted line 901 in FIGS. 9B and 9C indicates the size of a window 200 before resizing.
- the contents within a dotted line 902 indicate the display contents falling outside the window 200 after resizing.
- the first button is kept ON during dragging irrespective of ON/OFF of the second button.
- FIGS. 9A to 9C are views for explaining the display control method of this embodiment by adopting the configuration of the window corresponding to FIG. 2 , but they omit descriptions of the first and second regions for the sake of simplicity. Note that the third embodiment can be practiced in combination with the first embodiment, and this embodiment can also be applied to the window shown in FIG. 2 , which has the first and second regions.
- This embodiment can assure similar operations on any of four corners 207 to 210 of the window 200 , and the following description will be given taking as an example a case in which the user drags the lower right corner 209 .
- the position C changes like C 0 , C 1 , and C 2
- the position B changes like B 0 , B 1 , and B 2
- the position Q changes like Q 0 , Q 1 , and Q 2 in correspondence with the movement of the cursor position from P 0 to P 1 and to P 2 .
- FIG. 10 is a flowchart showing an example of the window resizing processing according to the third embodiment.
- the processing corresponding to the flowchart shown in FIG. 10 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control respective components.
- FIG. 10 describes a case in which the user resizes the window by dragging the lower right corner 209 of the window 200 .
- the embodiment of the invention is not limited to the case in which the lower right corner 209 is dragged. That is, the same processing as in FIG. 10 can resize the window by dragging the upper left corner 207 , lower left corner 208 , and upper right corner 210 .
- step S 1001 the CPU 101 acquires operation information (information of a first instruction operation) of a first button of the mouse 120 or digital pen 130 of the operation unit 109 , and information (moving information) of the moving direction and amount of the mouse 120 or digital pen 130 .
- the first button corresponds to the left button 121 of the mouse 120 if the mouse 120 is used in the default settings of Microsoft Windows®.
- the first button corresponds to a tip switch 131 at the pen tip of the digital pen 130 .
- the CPU 101 determines in step S 1002 based on the operation information of the first button acquired in step S 1001 whether or not the first button is switched from OFF to ON. If it is determined that the first button is switched to ON (“YES” in step S 1002 ), the process advances to step S 1003 . On the other hand, if it is determined that the first button is kept OFF without being switched to ON (“NO” in step S 1002 ), the process returns to step S 1001 to continue the processing.
- step S 1003 the CPU 101 calculates the position coordinate of the cursor P (cursor position coordinate) based on the moving amount information acquired in step S 1001 to determine on which corner of the window 200 the cursor is located. This determination can be made by checking which of the predetermined regions, set based on the corners that configure the window 200 , includes the cursor position coordinate.
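The corner determination of step S 1003 can be sketched as a simple hit test. The patent only says "predetermined regions set based on the corners"; the square grab region and its half-size `EPS` below are assumptions for illustration.

```python
# A minimal corner hit test for step S1003: each corner owns an assumed
# square grab region of half-size EPS pixels around it.
EPS = 4  # assumed half-size of the grab region, in pixels

def corner_under_cursor(px, py, left, top, right, bottom):
    """Return which corner of the window the cursor (px, py) is on, or None."""
    corners = {
        "upper_left": (left, top),       # corner 207
        "lower_left": (left, bottom),    # corner 208
        "lower_right": (right, bottom),  # corner 209
        "upper_right": (right, top),     # corner 210
    }
    for name, (cx, cy) in corners.items():
        if abs(px - cx) <= EPS and abs(py - cy) <= EPS:
            return name
    return None

assert corner_under_cursor(502, 399, 100, 50, 500, 400) == "lower_right"
assert corner_under_cursor(300, 200, 100, 50, 500, 400) is None
```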
- step S 1003 If it is determined that the cursor is located on the lower right corner 209 of the window 200 (“lower right corner 209 ” in step S 1003 ), it can be determined that the user begins to drag the lower right corner 209 . In this case, the process advances to step S 1004 . On the other hand, if the cursor is located on one of the remaining corners (on one of the corners 207 , 208 , and 210 ) (“another ” in step S 1003 ), it can be determined that the user begins to drag another corner. In this case, the process advances to step S 1005 . In step S 1005 , the CPU 101 executes window resizing processing by dragging of another corner.
- step S 1004 the CPU 101 determines the position coordinates P(Px, Py), C(Cx, Cy), B(Bx, By), and Q(Qx, Qy) at the beginning of dragging, as shown in FIG. 9A , for the window which begins to be dragged. Note that the definitions of respective coordinates are the same as those described above.
- step S 1006 the CPU 101 further acquires the information of the first instruction operation and moving amount information, and also operation information of a second button (information of a second instruction operation) of the mouse 120 or digital pen 130 of the operation unit 109 . Also, the CPU 101 updates the cursor position coordinate P(Px, Py) based on the moving amount information. The CPU 101 then determines in step S 1007 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S 1007 ), this processing ends. In this case, a so-called “drop” operation is made.
- step S 1008 the CPU 101 sets the position C(Cx, Cy) of the lower right corner 209 of the window 200 to match the cursor position P(Px, Py) updated in step S 1006 . In this way, the position of the lower right corner 209 follows the cursor movement.
- the CPU 101 determines in step S 1009 based on the operation information of the second button acquired in step S 1006 whether or not the second button is ON. If it is determined that the second button is ON (“YES” in step S 1009 ), the process advances to step S 1010 . On the other hand, if it is determined that the second button is OFF (“NO” in step S 1009 ), the process advances to step S 1011 .
- step S 1010 the CPU 101 sets the moving amount ΔQ(ΔQx, ΔQy) of the position Q of the arbitrary display contents to be equal to the moving amount ΔP(ΔPx, ΔPy) of the cursor. In this way, the display contents are scrolled by a size corresponding to the change amounts of the window 200 in the X and Y directions.
- step S 1011 the CPU 101 sets the moving amount ΔQ to be (0, 0). In this case, the display contents are not scrolled.
- step S 1012 the CPU 101 updates display of the cursor and window 200 based on the position of the lower right corner 209 determined in step S 1008 and the moving amount ΔQ determined in step S 1010 or S 1011 . After that, the process returns to step S 1006 to continue the processing.
- step S 1006 represents cursor movement during dragging, that is, that dragging is continued and resizing of the window is in progress during this loop.
- this represents that the drop operation is made to settle the window size.
- the two different operation buttons of the operation unit 109 are used, and the change method of the display contents within the window can be controlled based on combinations of the button operations. Since the combinations of the button operations can be changed in real time during resizing of the window, the position of the display contents within the window can be controlled simultaneously with resizing. In this way, a desired display result can be obtained by a series of operations, thus improving the work efficiency.
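The drag loop of FIG. 10 (steps S 1006 to S 1012 ) can be sketched as follows. This is an illustrative reading of the flowchart, not the actual program: the `(dx, dy, button1, button2)` event tuples are an assumed input format standing in for the operation information acquired in step S 1006 .

```python
# Sketch of the FIG. 10 loop for dragging the lower right corner: while the
# first button is held, the corner follows the cursor, and the scroll amount
# dQ equals the cursor movement dP only while the second button is ON.
def drag_lower_right(corner, content, events):
    """corner, content: [x, y] positions; events: iterable of
    (dx, dy, button1_on, button2_on) samples during the drag."""
    for dx, dy, button1, button2 in events:
        if not button1:      # "NO" in S1007: drop, window size settled
            break
        corner[0] += dx      # S1008: corner 209 follows the cursor
        corner[1] += dy
        if button2:          # S1009/S1010: resize with scrolling, dQ = dP
            content[0] += dx
            content[1] += dy
        # S1011: second button OFF -> dQ = (0, 0), contents not scrolled
    return corner, content

corner, content = [400, 300], [50, 60]
events = [(10, 5, True, True),    # scrolls together with the corner
          (20, 0, True, False),   # resizes without scrolling
          (0, 0, False, False)]   # drop operation ends the processing
drag_lower_right(corner, content, events)
assert corner == [430, 305] and content == [60, 65]
```

Because `button2` is read on every sample, scrolling can be switched ON and OFF in real time within a single drag, which is the point of this embodiment.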
- the window is resized by mainly dragging the corner of the window.
- the display control method according to this embodiment can be applied to a case wherein the window is resized by dragging its border.
- ON/OFF of scrolling upon resizing can be controlled by the same operations whether the corner or the border is dragged.
- the first control technique is effective upon attaching importance to resizing by dragging a border, and is especially effective in case of the second embodiment. In consideration of only the case of dragging the border, the first control technique can achieve the desired resizing by a simpler operation than the second control technique.
- the second control technique is effective when both the corner and the border may be dragged, and when importance is also attached to dragging of the corner.
- the desired resizing can be achieved by an operation common to the case of dragging the corner and that of dragging the border.
- This embodiment will explain display control of the present invention, which is applied to a case in which a window includes a plurality of sub-windows, and each sub-window is resized by dragging a boundary between the neighboring sub-windows.
- Some applications display using a window defined by a single area, and some other applications display using a window including a plurality of sub-windows.
- FIG. 11 shows an example of the latter application.
- the display efficiency can be improved compared to a case of a single window, and a more comfortable user interface can be provided.
- the left or top part of the display contents in each sub-window is preferentially displayed in some cases. This is based on the same situation as a window defined by a single area, that is, the idea that the first character of a sentence and the first line of a page are to be preferentially displayed.
- this embodiment provides a display control method that allows ON/OFF of scrolling of the sub-windows on the two sides of a boundary to be switched concurrently, in real time during resizing, when resizing by dragging the boundary.
- unlike in the related art, the need to fix ON/OFF of scrolling in advance is thereby obviated.
- Control mode 1 resizing with scrolling of both the sub-windows on the first and second sides
- Control mode 2 resizing with scrolling of the sub-window on the first side and that without scrolling of the sub-window on the second side
- Control mode 3 resizing without scrolling of the sub-window on the first side and that with scrolling of the sub-window on the second side
- Control mode 4 resizing without scrolling of both the sub-windows on the first and second sides
- the relationship between the sub-windows on the first and second sides can be considered as that between neighboring sub-windows on, for example, the left and right sides or the upper and lower sides of the boundary.
- FIG. 12 is a view for explaining this embodiment taking as an example a window which is divided into left and right sub-windows as the first and second sub-windows. Note that the user can drag only one boundary per drag operation, and the same display control applies to a window divided into upper and lower sub-windows as to one divided into left and right sub-windows.
- a window 1200 is defined by borders 1201 , 1202 , 1203 , and 1204 , and has sub-windows 1207 , 1208 , and 1209 partitioned by boundaries 1205 and 1206 .
- Each of the boundaries 1205 and 1206 is divided into two regions.
- the upper half region is called a first region
- the lower half region is called a second region.
- the division method is merely an example, and is not limited to that shown in FIG. 12 .
- the same division method of each border in the first embodiment may be adopted.
- the position of a cursor P can be expressed by P(Px, Py) based on an X-Y coordinate system 502 set on the display screen on which the window 1200 is displayed.
- Let LBy be the length of the boundary 1205 within the window 1200
- BL(BLx, BLy) be the position of an intersection between the lower end of the boundary 1205 and the lower border 1202 .
- condition 1 a condition required to locate the cursor P on the first region is described by:
- condition 2 a condition required to locate the cursor P on the second region is described by:
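The inequalities for conditions 1 and 2 are not reproduced in this excerpt. One plausible formulation, which is an assumption rather than the patent's exact expression, takes Y as increasing downward with BL(BLx, BLy) as the lower end of a boundary of length LBy: the first (upper-half) region then spans BLy − LBy ≤ Py ≤ BLy − LBy/2, and the second (lower-half) region spans BLy − LBy/2 < Py ≤ BLy.

```python
# Hypothetical hit test for conditions 1 and 2, assuming Y grows downward
# and the boundary is split into equal upper (first) and lower (second)
# halves, as described for FIG. 12.
def boundary_region(py, bly, lby):
    """Return 'first', 'second', or None for cursor height py on a boundary
    whose lower end is at bly and whose length is lby."""
    if bly - lby <= py <= bly - lby / 2:
        return "first"    # upper half of the boundary (condition 1)
    if bly - lby / 2 < py <= bly:
        return "second"   # lower half of the boundary (condition 2)
    return None

assert boundary_region(110, 400, 300) == "first"   # near the top end
assert boundary_region(390, 400, 300) == "second"  # near the bottom end
assert boundary_region(500, 400, 300) is None      # off the boundary
```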
- QL(QLx, QLy) be the position of arbitrary display contents within the sub-window 1207 on the left side of the boundary 1205
- QR(QRx, QRy) be the position of arbitrary display contents within the sub-window 1208 on the right side.
- the boundary 1205 will be described below. However, the scroll control of the display contents upon resizing the sub-windows with reference to the boundary 1206 can be similarly executed.
- ΔQLx and ΔQRx are differences of QLx and QRx before and after resizing of the sub-windows.
- ΔPx is a difference of Px before and after resizing of the sub-windows. Note that these differences correspond to the change amounts of the boundary 1205 in the X direction.
- the four types of resizing control of the control modes 1 to 4 are switched by combining dragging of the cursor which is located on either the first or second region, and ON/OFF of the second button operation.
- control mode 1 the cursor located on the first region is dragged, and the second button is ON.
- control mode 2 the cursor located on the first region is dragged, and the second button is OFF.
- control mode 3 the cursor located on the second region is dragged, and the second button is ON.
- control mode 4 the cursor located on the second region is dragged, and the second button is OFF.
- the display control method simultaneously uses control based on the position of the cursor in the Y direction used in the first and second embodiments, and control based on the second button of the operation unit 109 used in the third embodiment in cooperation with each other.
- switching between resizing with scrolling and that without scrolling for each of the sub-windows on the two sides is controlled concurrently during the single, continuous drag operation and cursor movement.
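The mapping from (cursor region, second button) to the four control modes can be sketched as a small lookup. The mapping follows the text above and steps S 1410 to S 1413 ; the function and return format are hypothetical.

```python
# Sketch of control-mode selection for the boundary drag: the cursor region
# picks whether the left (first-side) sub-window scrolls, and the second
# button picks whether the right (second-side) sub-window scrolls.
def select_control_mode(region, button2_on):
    """Map (region, second button) to (mode number, scroll left?, scroll right?)."""
    table = {
        ("first", True):   (1, True,  True),   # scroll both sides
        ("first", False):  (2, True,  False),  # scroll first side only
        ("second", True):  (3, False, True),   # scroll second side only
        ("second", False): (4, False, False),  # scroll neither side
    }
    return table[(region, button2_on)]

assert select_control_mode("first", True) == (1, True, True)
assert select_control_mode("second", False) == (4, False, False)
```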
- the start and continuation of dragging are controlled by ON/OFF of the first button of the operation unit 109 as in the above embodiments.
- FIG. 13 shows an example of a change in display contents when the user moves a boundary on a window divided by the single boundary.
- reference numeral 1301 denotes a state before beginning of dragging.
- a left sub-window displays alphabetical letters “ABD”
- a right sub-window displays three rows of numerals “1” to “9”.
- a display state of a window 1303 is set. Since both the left and right sub-windows are scrolled, the display contents near the boundary remain unchanged, but those near the left and right borders of the window are changed.
- a display state of a window 1304 is set. At this time, only the right sub-window is scrolled. Hence, alphabetical letters “FG” hidden on the left sub-window are newly displayed near the boundary. On the other hand, on the right sub-window, numerals “1 2 3” near the right border of the window, which were displayed on the window 1303 , are hidden.
- a display state like a window 1305 is set. At this time, only the left sub-window is scrolled. Hence, on the left sub-window, alphabetical letters “AB” hidden near the left border of the window are displayed. On the other hand, since the right sub-window is not scrolled, numerals “3 4 5 6” are hidden by the boundary.
- FIG. 14 is a flowchart showing an example of the window resizing processing according to the fourth embodiment.
- the processing corresponding to the flowchart shown in FIG. 14 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control respective components.
- FIG. 14 describes a case in which the user resizes the sub-windows by dragging the boundary 1205 of the window 1200 .
- the embodiment of the invention is not limited to the case in which the boundary 1205 is dragged. That is, the same processing as in FIG. 14 can resize the sub-windows by dragging the boundary 1206 or another boundary.
- step S 1401 the CPU 101 acquires operation information (information of a first instruction operation) of a first button of a mouse 120 or digital pen 130 of the operation unit 109 , and information (moving information) of the moving direction and amount of the mouse 120 or digital pen 130 .
- the first button corresponds to a left button 121 of the mouse 120 if the mouse 120 is used in the default settings of Microsoft Windows®.
- the first button corresponds to a tip switch 131 at the pen tip of the digital pen 130 .
- the CPU 101 determines in step S 1402 based on the operation information of the first button acquired in step S 1401 whether or not the first button is switched from OFF to ON. If it is determined that the first button is switched to ON (“YES” in step S 1402 ), the process advances to step S 1403 . On the other hand, if it is determined that the first button is kept OFF without being switched to ON (“NO” in step S 1402 ), the process returns to step S 1401 to continue the processing.
- step S 1403 the CPU 101 calculates the position coordinate of the cursor P (cursor position coordinate) based on the moving amount information acquired in step S 1401 to determine on which boundary of the window 1200 the cursor is located. This determination can be made by checking which of the predetermined regions, set based on the boundaries included in the window 1200 , includes the cursor position coordinate.
- step S 1403 If it is determined that the cursor is located on the boundary 1205 of the window 1200 (“boundary 1205 ” in step S 1403 ), it can be determined that the user begins to drag the boundary 1205 . In this case, the process advances to step S 1404 . On the other hand, if the cursor is located on one of the remaining boundaries (on the boundary 1206 or the like) (“another” in step S 1403 ), it can be determined that the user begins to drag another boundary. In this case, the process advances to step S 1405 . In step S 1405 , the CPU 101 executes window resizing processing by dragging of another boundary.
- step S 1404 the CPU 101 determines the position coordinates P(Px, Py), BL(BLx, BLy), QL(QLx, QLy), and QR(QRx, QRy) at the beginning of dragging, as shown in FIG. 12 , for the window which begins to be dragged. Note that the definitions of respective coordinates are the same as those described above.
- step S 1406 the CPU 101 further acquires the information of the first instruction operation and moving amount information, and also operation information of the second button (information of a second instruction operation) of the mouse 120 or digital pen 130 of the operation unit 109 . Also, the CPU 101 updates the cursor position coordinate P(Px, Py) based on the moving amount information. The CPU 101 then determines in step S 1407 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S 1407 ), this processing ends. In this case, a so-called “drop” operation is made.
- step S 1408 the CPU 101 sets the X component BLx of the end position BL of the boundary 1205 to match the X component Px of the cursor position P updated in step S 1406 . In this way, the position of the boundary 1205 follows the cursor movement.
- the CPU 101 determines in step S 1409 , based on the coordinate Py of the cursor position in the Y direction obtained in step S 1406 , on which of the first and second regions the cursor P is located, and, based on the operation information of the second button, whether the second button is ON.
- step S 1410 If the cursor P is located on the first region, and the second button is ON, the process advances to step S 1410 . If the cursor P is located on the first region, and the second button is OFF, the process advances to step S 1411 . Furthermore, if the cursor P is located on the second region, and the second button is ON, the process advances to step S 1412 . Moreover, if the cursor P is located on the second region, and the second button is OFF, the process advances to step S 1413 .
- step S 1410 the CPU 101 sets the moving amount ΔQLx of the position QL of the arbitrary display contents in the X direction on the left sub-window 1207 as the first side of the boundary 1205 to be equal to the moving amount ΔPx of the cursor P in the X direction. Also, the CPU 101 sets the moving amount ΔQRx of the position QR of the arbitrary display contents in the X direction on the right sub-window 1208 as the second side of the boundary 1205 to be equal to the moving amount ΔPx of the cursor P in the X direction. As a result, the display contents on the sub-windows are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction.
- step S 1411 the CPU 101 sets the moving amount ΔQLx of the position QL of the arbitrary display contents in the X direction on the left sub-window 1207 as the first side of the boundary 1205 to be equal to the moving amount ΔPx of the cursor P in the X direction. Also, the CPU 101 sets the moving amount ΔQRx of the position QR of the arbitrary display contents in the X direction on the right sub-window 1208 as the second side of the boundary 1205 to be zero. In this way, the display contents on the left sub-window 1207 are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction. On the other hand, the display contents on the right sub-window 1208 are not scrolled.
- step S 1412 the CPU 101 sets the moving amount ΔQLx of the position QL of the arbitrary display contents in the X direction on the left sub-window 1207 as the first side of the boundary 1205 to be zero. Also, the CPU 101 sets the moving amount ΔQRx of the position QR of the arbitrary display contents in the X direction on the right sub-window 1208 as the second side of the boundary 1205 to be equal to the moving amount ΔPx of the cursor P in the X direction. In this way, the display contents on the left sub-window 1207 are not scrolled. On the other hand, the display contents on the right sub-window 1208 are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction.
- step S 1413 the CPU 101 sets the moving amount ΔQLx of the position QL of the arbitrary display contents in the X direction on the left sub-window 1207 as the first side of the boundary 1205 to be zero. Also, the CPU 101 sets the moving amount ΔQRx of the position QR of the arbitrary display contents in the X direction on the right sub-window 1208 as the second side of the boundary 1205 to be zero. In this way, the display contents on the sub-windows 1207 and 1208 are not scrolled.
- step S 1414 the CPU 101 updates displays of the cursor and window 1200 .
- the CPU 101 executes this updating process based on the position BLx of the boundary 1205 determined in step S 1408 , and the moving amounts ΔQLx and ΔQRx determined in any of steps S 1410 to S 1413 . After that, the process returns to step S 1406 to continue the processing.
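One pass through steps S 1408 to S 1414 can be sketched as follows. The variable names mirror the text (BLx, QLx, QRx); the dictionary state and function name are assumptions for illustration.

```python
# Sketch of one drag sample in FIG. 14: the boundary follows the cursor's X
# movement (S1408), and each sub-window's contents move by dPx or by zero
# depending on the cursor region and the second button (S1410-S1413).
def update_boundary(state, dpx, region, button2_on):
    """state: {'BLx': boundary X, 'QLx': left contents X, 'QRx': right
    contents X}. Returns the updated state."""
    state["BLx"] += dpx                  # S1408: boundary follows cursor
    scroll_left = region == "first"      # left side scrolls in modes 1 and 2
    scroll_right = button2_on            # right side scrolls in modes 1 and 3
    state["QLx"] += dpx if scroll_left else 0
    state["QRx"] += dpx if scroll_right else 0
    return state                         # S1414: redraw cursor and window

s = {"BLx": 200, "QLx": 10, "QRx": 210}
update_boundary(s, -30, region="first", button2_on=False)  # control mode 2
assert s == {"BLx": 170, "QLx": -20, "QRx": 210}
```

Reading the region and button on every sample is what lets scrolling of each side be switched concurrently during a single continuous drag.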
- step S 1406 represents cursor movement during dragging, that is, that dragging is continued and resizing of the window is in progress during this loop.
- this represents that the drop operation is made to settle the window size.
- this embodiment has been described. Note that the display control method according to this embodiment can be applied not only to the window of the configuration shown in FIGS. 12 and 13 but also to a window divided into upper and lower sub-windows. Furthermore, the method of this embodiment can be applied to a window divided into upper, lower, left and right sub-windows, as shown in FIG. 11 .
- the window shown in FIG. 11 is normally configured so that a boundary which divides the upper and lower sub-windows and one which divides the left and right sub-windows are independently operable. Hence, by executing the same processing as that shown in FIG. 14 in turn on these boundaries, the display control method of this embodiment can be applied.
- the first and second regions are required to be defined on each boundary.
- the length of each boundary may be equally divided.
- a part divided by an intersection of the vertical and horizontal boundaries may be equally divided.
- the lengths of the first and second regions change sequentially depending on the position of the intersection.
- the change method of the display contents in the sub-windows can be controlled simultaneously with resizing of the sub-windows. In this way, a desired display result can be obtained by a series of operations, thus improving the work efficiency.
- This embodiment proposes display control which is executed in association with the scrolling ON/OFF control method upon resizing a window, that is proposed by the present invention.
- display control executed upon resizing includes control for switching ON/OFF of scrolling or a scroll ratio of the display contents according to dragging of a border or corner, control for reducing or enlarging the display contents according to dragging of a border or corner, or the like.
- when the display contents are scrolled upon resizing, the contents of an area opposite to the dragged part are hidden.
- when the display contents are not scrolled upon resizing, the contents of an area near the dragged part are hidden.
- the “area opposite to the dragged part” is an area near a border opposite to the dragged border, or an area near two borders that do not contact the dragged corner.
- the “area near the dragged part” is an area near the dragged border or an area near two borders that contact the dragged corner.
- object images such as characters, patterns, photos, and the like, which are located on an area that would normally be hidden, are displayed crowded into that area, so as to allow the user to see them.
- display contents shown in FIG. 16 are assumed. This may be a normal window described in the first embodiment or may be a window which is described in the second embodiment, and is always maximized in one direction (Y direction) within the display screen.
- a left border 1601 is movable by dragging, and a window 1600 can be resized by moving this border 1601 .
- FIGS. 17A and 17B show display examples when the user resizes (reduces) the window by dragging the border in this embodiment.
- FIG. 17A shows a display example upon resizing with scrolling. With this display control, respective objects move to the right upon resizing, and their movement stops when these objects are brought into contact with the opposing border. In this case, the objects are displayed to overlap each other near the opposing border.
- FIG. 17B shows a display example upon resizing without scrolling.
- With this display control, since scrolling is not performed, all objects are displayed without moving their positions at the beginning of dragging of the border. However, when the dragged border moves to the right and is brought into contact with respective objects, these objects begin to move to the right. In this case, the objects are displayed to overlap each other near the dragged border. As for the overlapping order, a newly stopped object may be displayed in front of or behind a preexistent object.
- In this manner, display control as if objects attached to a window were being scooped up by a wiper can be implemented, and objects which are normally hidden are displayed, although imperfectly, thus improving the usability.
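The two behaviors contrasted in FIGS. 17A and 17B can be sketched in code. The following is an illustrative one-dimensional model, not the patent's implementation: objects are intervals on the X-axis, the left border is dragged to the right by `dx` to reduce the window, and a flag selects between resizing with scrolling (objects scroll and pile up at the opposing border) and without scrolling (objects stay until the dragged border sweeps them along). All names (`Obj`, `move_objects`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Obj:
    x1: float  # left edge of the object's display area
    x2: float  # right edge

def move_objects(objects, left_border, right_border, dx, scrolling):
    """Update object X positions for a left-border drag of +dx.

    With scrolling (cf. FIG. 17A): every object moves right with the
    contents and stops when it reaches the opposing (right) border, so
    late arrivals overlap near that border. Without scrolling (cf.
    FIG. 17B): an object stays still until the dragged border reaches
    it, then is pushed along with the border, like being scooped by a
    wiper.
    """
    new_left = left_border + dx
    for o in objects:
        w = o.x2 - o.x1
        if scrolling:
            o.x1 = min(o.x1 + dx, right_border - w)  # stop at opposing border
        else:
            o.x1 = max(o.x1, new_left)               # pushed by dragged border
        o.x2 = o.x1 + w
    return objects
```

For example, with a window spanning 0 to 100 and dx = 20, an object at (50, 55) moves to (70, 75) with scrolling but stays at (50, 55) without scrolling, while an object at (10, 15), which the dragged border sweeps past, is pushed to (20, 25) in the non-scrolling mode.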
- FIGS. 18A and 18B show display examples upon resizing a window by dragging one corner of the window.
- FIG. 18A shows a display example upon resizing with scrolling, and FIG. 18B shows a display example upon resizing without scrolling.
- The respective operations are the same as those described with reference to FIGS. 17A and 17B for the X and Y components.
- FIG. 19A shows a display example upon resizing with scrolling.
- The following display control is executed. That is, respective objects move to the right upon resizing, and their movement stops when they are brought into contact with the opposing border.
- When a moving object is brought into contact with an object that has already stopped, the movement of that object stops at that time.
- As a result, objects are displayed so as not to overlap each other, unlike in FIG. 17A.
- FIG. 19B shows a case upon resizing without scrolling.
- The following display control is executed. That is, all objects stand still initially. When the dragged border moves to the right and is brought into contact with respective objects, these objects begin to move to the right. In addition, when objects which have already begun to move are brought into contact with other objects, the other objects begin to move at that time. As a result, objects are displayed so as not to overlap each other, unlike in FIG. 17B.
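This chained, non-overlapping movement can be sketched as follows (an illustrative model with hypothetical names, assuming a left border dragged rightward): each object is pushed only as far as needed, and a pushed object in turn becomes the barrier for the next one, so objects line up without overlapping as in FIGS. 19A and 19B.

```python
from dataclasses import dataclass

@dataclass
class Box:
    x1: float  # left edge
    x2: float  # right edge

def push_chain(objects, new_left):
    """Push objects right of the dragged border position `new_left` so
    that none overlaps: a pushed object pushes the next one in turn,
    mirroring the chained movement described for FIGS. 19A and 19B."""
    edge = new_left
    for b in sorted(objects, key=lambda b: b.x1):
        w = b.x2 - b.x1
        b.x1 = max(b.x1, edge)   # pushed only if the barrier reached it
        b.x2 = b.x1 + w
        edge = b.x2              # the next object cannot start before this
    return sorted(objects, key=lambda b: b.x1)
```

For instance, with the border moved to 12, objects at (10, 15) and (16, 20) end up at (12, 17) and (17, 21): the first is pushed by the border, the second by the first.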
- FIG. 20 is a flowchart showing an example of the window resizing processing corresponding to the display examples shown in FIGS. 17A and 17B .
- the processing corresponding to the flowchart shown in FIG. 20 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control respective components.
- the CPU 101 determines in step S 2001 whether or not the user begins to drag a border. If the user begins to drag the border (“YES” in step S 2001 ), the process advances to step S 2002 .
- the CPU 101 determines in step S 2002 if scrolling is ON simultaneously with resizing of a window by dragging. If it is determined that scrolling is OFF (“NO” in step S 2002 ), the process advances to step S 2003 ; otherwise (“YES” in step S 2002 ), the process advances to step S 2005 . Note that ON/OFF of scrolling can be determined according to the processes described in the first to fourth embodiments.
- A display area of an object O is expressed by O{(O1x, O1y), (O2x, O2y)}, where (O1x, O1y) represents the coordinates of the upper left end of the object and (O2x, O2y) represents the coordinates of the lower right end of the object.
- The left direction corresponds to the negative direction of the X-axis on an X-Y coordinate system 502 set in association with the display screen, and the up direction corresponds to the positive direction of the Y-axis.
- Let ΔO(ΔO1x, ΔO2x) be a change in the display area O in the X-axis direction.
- The CPU 101 determines in step S2003 whether or not there is an object which is in contact with the dragged border. This determination process can be attained by comparing the coordinates of the display position of the object with those of the dragged border. At this time, when the X-coordinate Bx of the dragged border falls within the range O1x ≤ Bx ≤ O2x, the object can be considered to be in contact with the dragged border. Note that since the flowchart of FIG. 20 assumes the case of FIGS. 17A and 17B, that is, the case of dragging the border in the X direction, only the coordinate in the X-axis direction is considered. In addition, when a border also moves in the Y direction, whether or not an object is in contact with the dragged border can be determined by checking whether or not the position By of the border in the Y direction falls within the Y range of that object.
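The contact test of step S2003 can be written directly from the coordinate definitions above. This is a sketch, not the patented implementation; min/max is used for the Y range because O1y is the upper end on the Y-up coordinate system.

```python
def border_contacts_object(bx, by, o1x, o1y, o2x, o2y, x_drag=True):
    """Return True if a dragged border touches the object whose display
    area is O{(o1x, o1y), (o2x, o2y)}.

    For a border dragged in the X direction (the case of FIGS. 17A and
    17B), contact occurs when o1x <= bx <= o2x; when the border moves
    in the Y direction, by is checked against the object's Y range
    instead.
    """
    if x_drag:
        return o1x <= bx <= o2x
    return min(o1y, o2y) <= by <= max(o1y, o2y)
```

For an object spanning X = 3 to 8, a border at Bx = 5 is in contact while one at Bx = 9 is not.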
- If it is determined that there is an object that is in contact with the dragged border ("YES" in step S2003), the process advances to step S2004. On the other hand, if it is determined that there is no object that is in contact with the dragged border ("NO" in step S2003), the process jumps to step S2007.
- If scrolling is executed simultaneously with dragging of the border, the CPU 101 determines in step S2005 whether or not there is an object that is in contact with the border opposite to the dragged border.
- If it is determined that there is an object that contacts the opposing border ("YES" in step S2005), the process advances to step S2006. On the other hand, if it is determined that there is no object that contacts the opposing border ("NO" in step S2005), the process jumps to step S2007.
- In step S2007, the CPU 101 updates display of the object which is in contact with the border based on the moving amount of the object determined in step S2004 or S2006.
- The CPU 101 also updates display of the other objects according to ON/OFF of scrolling, based on the determination result in step S2002.
- The CPU 101 then determines in step S2008 whether or not the user ends dragging. If it is determined that the user ends dragging ("YES" in step S2008), this processing ends. On the other hand, if it is determined that the user does not end dragging ("NO" in step S2008), the process returns to step S2002 to continue the processing.
- By extending the aforementioned processing also in the Y direction, the display control corresponding to FIGS. 18A and 18B can be implemented.
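The overall loop of FIG. 20 can be sketched as follows. This is an illustrative skeleton with hypothetical callbacks, not the patented implementation; the point worth noting is that the scrolling state is re-read on every iteration (step S2002), so the user can toggle scrolling in the middle of a drag.

```python
def resize_drag_loop(window, deltas, scroll_state, update_fn):
    """Skeleton of steps S2002-S2008 in FIG. 20.

    `deltas` yields one border movement per iteration until the drag
    ends; `scroll_state()` is queried each time (S2002), and `update_fn`
    stands in for the contact checks and display updates of S2003-S2007.
    """
    history = []
    for dx in deltas:                     # loop runs until dragging ends (S2008)
        scroll_on = scroll_state()        # S2002: is scrolling ON right now?
        update_fn(window, dx, scroll_on)  # S2003-S2007: move and redraw objects
        history.append((dx, scroll_on))
    return history
```

Because `scroll_state` is a callable rather than a fixed flag, the same loop serves both a drag made entirely with scrolling ON and one where the user switches modes partway through.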
- the display control of objects within a display area can be implemented based on the presence/absence of a contact with the boundary or border in the same manner as described above.
- the above-described exemplary embodiments of the present invention can also be achieved by providing a computer-readable storage medium that stores program code of software (computer program) which realizes the operations of the above-described exemplary embodiments, to a system or an apparatus. Further, the above-described exemplary embodiments can be achieved by program code (computer program) stored in a storage medium read and executed by a computer (CPU or micro-processing unit (MPU)) of a system or an apparatus.
- The computer program realizes each step included in the flowcharts of the above-mentioned exemplary embodiments.
- That is, the computer program is a program for causing a computer to function as each processing unit corresponding to each step included in the flowcharts.
- the computer program itself read from a computer-readable storage medium realizes the operations of the above-described exemplary embodiments, and the storage medium storing the computer program constitutes the present invention.
- the storage medium which provides the computer program can be, for example, a floppy disk, a hard disk, a magnetic storage medium such as a magnetic tape, an optical/magneto-optical storage medium such as a magneto-optical disk (MO), a compact disc (CD), a digital versatile disc (DVD), a CD read-only memory (CD-ROM), a CD recordable (CD-R), a nonvolatile semiconductor memory, a ROM and so on.
- an OS or the like working on a computer can also perform a part or the whole of processes according to instructions of the computer program and realize functions of the above-described exemplary embodiments.
- In such a case, the CPU executes each step in the flowchart in cooperation with a memory, hard disk, display device, and so on.
- the present invention is not limited to the above configuration, and a dedicated electronic circuit can perform a part or the whole of processes in each step described in each flowchart in place of the CPU.
Description
- 1. Field of the Invention
- The present invention relates to an information processing apparatus and control method thereof, and a computer program.
- 2. Description of the Related Art
- Conventionally, there is known an information processing system which can simultaneously execute a plurality of applications with user interfaces, display a plurality of windows corresponding to the applications at the same time, and control the respective windows to serve as independent user interfaces.
- In such a case, the information processing system can display the plurality of windows by one of the following methods. A method of overlapping the windows at arbitrary locations according to the rule of a predetermined priority order upon displaying the respective windows is available (overlap method). Also, a method of tiling the windows without overlapping each other upon displaying the respective windows is available (tiling window method). In general, when many windows need to be displayed within a limited display screen, the overlap method is more effective.
- Most windows allow modification of their sizes and locations in the X and Y directions independently or simultaneously. When the overlap method is used, the windows need to be moved or resized to prevent windows from becoming completely covered as a result of window overlap.
- When a plurality of applications run in parallel, and corresponding windows are displayed at the same time, a display controller of an information processing apparatus displays a window to be prioritized or a window selected by the user for access in front of all other windows in each case. The whole area of the window displayed in front of all other windows is displayed, and partial areas of other windows are displayed based on their overlapping states.
- However, in this situation, when the user wants to frequently access a window hidden by other windows or to refer to its contents, the user needs to make predetermined window operations. These operations include switching the display to locate a desired window in front of all other windows, and downsizing or moving the windows located in front of the target window.
- In general, upon resizing a window (e.g., to reduce its size), it becomes difficult to display all the contents displayed before resizing within the resized window. For this reason, only partial contents to be prioritized are displayed. The sequence for determining such part to be prioritized is executed either automatically or manually.
- A window is resized by dragging one border or corner of the window. The window is moved by dragging a specific region which is not used for resizing.
- Upon resizing a window, there is a specification prepared in advance for each window type, and display control upon resizing is performed based on the specification. More specifically, a specification that moves the display contents upon dragging when a window is resized by dragging one border or corner is available. Also, a specification that does not move the display contents irrespective of dragging is available. Furthermore, a specification that moves the display contents to have a predetermined ratio with respect to dragging or reduces or modifies them is available.
- These specifications are determined in advance for respective window types or for respective places to be dragged even on one window. Note that in the present specification, moving the display contents of a window upon resizing the window will be referred to as “scrolling”.
- A general display control method upon resizing a window will be described below.
- FIG. 21 shows the configuration of a window to be displayed on a display device. FIG. 21 shows a window that displays a document. FIGS. 22A to 22D and FIGS. 23A to 23D are explanatory views of popular display control methods upon resizing a window.
- FIGS. 22A to 22D are views showing cases in which the window shown in FIG. 21 is resized by dragging one of the four borders.
- In general, upon resizing the window by moving the right or bottom border of the four borders, the display contents near the border opposite to the border to be moved remain unchanged, and those near the border to be moved are changed. FIGS. 22A and 22C show the cases in which the window size is reduced by moving the right or bottom border. In these cases, the display contents near the border to be moved are gradually hidden.
- Upon resizing the window by moving the left or top border of the four borders, the display contents near the border to be moved remain unchanged, and those near the border opposite to the border to be moved are changed. FIGS. 22B and 22D show the cases in which the window size is reduced by moving the left or top border. In these cases, the display contents near the right or bottom border opposite to the border to be moved are gradually hidden.
- FIGS. 23A to 23D show cases in which the window shown in FIG. 21 is resized by dragging the corners of the window. Note that the corners of the window mean the intersections of the respective borders that define the window.
- As shown in FIGS. 23A to 23D, when the window is resized by dragging the upper left, upper right, lower left, and lower right corners of the window, the display contents near the upper left corner remain unchanged, and those near other corners are gradually hidden.
- The concept of the display control shown in FIGS. 22A to 22D and FIGS. 23A to 23D is to basically preferentially display the left and up directions of the display contents of a window. On the other hand, many windows which aim at the drawing function and display of general figures do not always preferentially display the left and up directions, and different specifications are determined in advance for respective window types.
- Many specifications associated with resizing of a window are designed to naturally locate the contents to be prioritized at a display position if the user normally makes a resizing operation. However, a part that the user wants to display does not always move to the display position, and an operation for individually shifting the position of the display contents after resizing is often required.
- Most windows have scroll bars to shift the position of the display contents. In general, the user can move the contents that the user wants to display or access to a desired position within the window by operating the scroll bar.
- The inventions that improve the operations for resizing a window by dragging, for example, a predetermined part of the window are disclosed in Japanese Patent Nos. 2765615 and 3431795.
- On the other hand, a certain window often configures parent and child windows defined by predetermined specifications, so as to prevent related windows from becoming hard to see due to overlapping display, or to prevent the correspondence between the related windows from becoming confusing.
- The inventions that relate to a method of controlling the relationship between the parent and child windows upon resizing a window are disclosed in Japanese Patent Laid-Open No. 9-185480 and Japanese Patent No. 3586351.
- In order to resize (especially, reduce) a window and to preferentially display a desired part, use of the display control specification determined in advance for each window type does not suffice. In many cases, the user needs to perform two operations step by step in such a manner that the user is required to scroll the display contents by a predetermined amount in a predetermined direction after resizing. Such requirement results in inefficiency upon making various operations on a computer, thus decreasing productivity accordingly.
- Embodiments of the present invention provide a technique that allows the user to arbitrarily and intuitively perform an operation for moving a desired part to be prioritized to a predetermined location concurrently with resizing a window.
- According to an exemplary embodiment of the present invention, there is provided an information processing apparatus comprising: a display unit configured to display a window; an accepting unit configured to accept a resize instruction of the displayed window together with a scroll instruction indicating whether or not to scroll display contents within the window; and a control unit configured to control a size of the window and scrolling of the display contents within the window based on contents of the resize instruction and the scroll instruction, wherein when the scroll instruction indicates that the display contents are to be scrolled, the control unit changes the window to a size indicated by the resize instruction and scrolls the display contents according to a change amount of the window, and when the scroll instruction indicates that the display contents are not to be scrolled, the control unit changes the window to a size indicated by the resize instruction and suppresses scrolling of the display contents.
- According to another exemplary embodiment of the present invention, there is provided a method of controlling an information processing apparatus comprising: displaying a window on a display unit; accepting a resize instruction of the displayed window together with a scroll instruction indicating whether or not to scroll display contents within the window; and controlling a size of the window and scrolling of the display contents within the window based on contents of the resize instruction and the scroll instruction, wherein when the scroll instruction indicates that the display contents are to be scrolled, the window is changed to a size indicated by the resize instruction and the display contents are scrolled according to a change amount of the window, and when the scroll instruction indicates that the display contents are not to be scrolled, the window is changed to a size indicated by the resize instruction and scrolling of the display contents is suppressed.
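The control described in these two paragraphs can be illustrated with a rough sketch. This is a one-dimensional model with names of our own (`WindowState`, `apply_resize`), not the patented implementation: the resize instruction sets the new size, and the scroll instruction decides whether the contents scroll by the change amount or stay put.

```python
from dataclasses import dataclass

@dataclass
class WindowState:
    width: int
    scroll_x: int  # horizontal scroll offset of the display contents

def apply_resize(win, new_width, scroll_contents):
    """Resize the window; scroll the contents by the change amount only
    when the scroll instruction says so, otherwise suppress scrolling."""
    change = new_width - win.width       # change amount of the window
    win.width = new_width                # follow the resize instruction
    if scroll_contents:
        win.scroll_x += change           # scroll according to the change amount
    # otherwise scrolling is suppressed: scroll_x stays unchanged
    return win
```

Reducing a 100-unit-wide window to 80 thus shifts the contents by -20 when the scroll instruction is ON, and leaves them in place when it is OFF.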
- Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
- FIG. 1A is a block diagram showing an example of the hardware arrangement of an information processing apparatus according to an embodiment of the invention;
- FIG. 1B shows an example of the arrangement of a mouse as an example of an operation unit 109 according to the embodiment of the invention;
- FIG. 1C shows an example of the arrangement of a digital pen and tablet as an example of the operation unit 109 according to the embodiment of the invention;
- FIG. 2 shows an example of the configuration of a window according to the embodiment of the invention;
- FIGS. 3A to 3D show display examples when the user locates a cursor on a first or second region of a top border 201 or bottom border 202 of a window and drags it according to the first embodiment of the invention;
- FIGS. 4A to 4D show display examples when the user locates the cursor on a first or second region of a left border 203 or right border 204 of a window and drags it according to the first embodiment of the invention;
- FIG. 5A is a view for explaining ON/OFF switching of scrolling upon resizing according to the first embodiment of the invention;
- FIG. 5B is a view for explaining ON/OFF switching of scrolling upon resizing when the cursor position is changed from the display state of FIG. 5A;
- FIG. 6 is a flowchart showing an example of window resizing processing according to the first embodiment of the invention;
- FIG. 7A shows an example of a state in which the size of a window 200 matches that of a whole display screen 700 according to the second embodiment of the invention;
- FIG. 7B shows an example of a state in which the size of the window 200 changes when the user locates a cursor 701 on a second region 203b and drags it in the X direction according to the second embodiment of the invention;
- FIG. 7C shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a first region 203a and drags it in the X direction according to the second embodiment of the invention;
- FIG. 8A shows an example of a state in which the size of the window 200 matches that of the whole display screen 700 according to the second embodiment of the invention;
- FIG. 8B shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a second region 202b and drags it in the Y direction according to the second embodiment of the invention;
- FIG. 8C shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a first region 202a and drags it in the Y direction according to the second embodiment of the invention;
- FIG. 9A shows an example of a state before the beginning of dragging when the user locates a cursor P on a corner 209 (P0) in a display control method according to the third embodiment of the invention;
- FIG. 9B shows an example of a state in which the user moves the cursor P from the position P0 on the corner 209 to a position P1 in the display control method according to the third embodiment of the invention;
- FIG. 9C shows an example of a state in which the user moves the cursor P from P1 to P2 in the display control method according to the third embodiment of the invention;
- FIG. 10 is a flowchart showing an example of window resizing processing according to the third embodiment of the present invention;
- FIG. 11 shows an example of a window including a plurality of sub-windows;
- FIG. 12 is a view for explaining the fourth embodiment of the invention, taking as an example a window which is divided into left and right sub-windows as first and second sub-windows;
- FIG. 13 shows an example of a change in display contents when the user moves a boundary in a window divided by one boundary according to the fourth embodiment of the invention;
- FIG. 14 is a flowchart showing an example of window resizing processing according to the fourth embodiment of the invention;
- FIGS. 15A and 15B show division examples of boundaries;
- FIG. 16 shows an example of display contents of a window according to the fifth embodiment of the invention;
- FIGS. 17A and 17B show display examples when the user resizes (reduces) the window by dragging one border of the window according to the fifth embodiment of the invention;
- FIGS. 18A and 18B show display examples when the user resizes the window by dragging one corner of the window according to the fifth embodiment of the invention;
- FIGS. 19A and 19B show display examples that allow a normally hidden part to be easier to see according to the fifth embodiment of the invention;
- FIG. 20 is a flowchart showing an example of window resizing processing corresponding to the display examples shown in FIGS. 17A and 17B;
- FIG. 21 shows the configuration of a window displayed on a display device;
- FIGS. 22A to 22D show cases in which the user resizes the window shown in FIG. 21 by dragging one of four borders; and
- FIGS. 23A to 23D show cases in which the user resizes the window shown in FIG. 21 by dragging one of four corners of the window.
- Embodiments of the invention will be described hereinafter with reference to the accompanying drawings.
- The present invention provides a technique which arbitrarily controls, concurrently with dragging, whether or not to scroll the display contents of a window in response to dragging, upon resizing the window by dragging an element (border, corner, boundary, etc.) which configures the window. In particular, the present invention proposes the following three control techniques.
- The first control technique covers a case in which the user resizes a window by mainly dragging one border of the window. This technique is characterized in that a direction component of the cursor motion which is not directly related to resizing is used for control. In the corresponding embodiments, two different regions are formed on each border of a window, and ON/OFF of scrolling can be controlled concurrently with dragging by selecting a region while dragging.
- The second control technique is characterized in that ON/OFF of scrolling is controlled by operating a button of the operation unit other than the one used for dragging while making a drag movement. This technique can be applied both to the case of dragging a corner and to that of dragging a border.
- The third control technique executes control by combining the first and second control techniques. With this technique, on a window including a plurality of sub-windows, each sub-window is resized by dragging a boundary of the sub-window. In the case of a window including a plurality of sub-windows, since each boundary is independently controlled, this technique can also be applied to a window including many sub-windows.
- The first embodiment of the invention will be described hereinafter. This embodiment relates to the first control technique.
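Before turning to the hardware, the region-based selection of the first control technique can be sketched roughly. This is an illustrative model with hypothetical names (`border_region`, `scroll_on`); the equal-thirds division follows this embodiment's description, and mapping the first region to "with scrolling" follows the display examples of FIGS. 3A to 4D, but the code itself is not part of the patent.

```python
def border_region(pos, start, end):
    """Classify a cursor position along a border divided equally into
    three parts: the middle third is the first region, and the two
    outer thirds form the second region."""
    third = (end - start) / 3.0
    return "first" if start + third <= pos <= end - third else "second"

def scroll_on(region):
    """Assumed mapping: dragging on the first region resizes the window
    with scrolling, and dragging on the second region resizes it
    without scrolling."""
    return region == "first"
```

With a border running from 0 to 90, a drag started at position 45 falls in the first region (scrolling ON), while one started at position 10 falls in the second region (scrolling OFF).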
-
FIG. 1A is a block diagram showing an example of the hardware arrangement of an information processing apparatus used to implement the present invention. Referring toFIG. 1A , aCPU 101 executes an OS, application programs, and the like stored in an HD (hard disk) 103, and controls to temporarily store information, files, and the like required for execution of the programs in aRAM 102. TheRAM 102 serves as a main memory, work area, and the like of theCPU 101. TheHD 103 stores the application programs, driver programs, the OS, control programs, a processing program required to execute processing according to this embodiment, and the like. - A
display unit 104 displays information according to commands input from anoperation unit 109, externally acquired information, and the like. Thedisplay unit 104 may adopt any display method of CRT type, liquid crystal type, PDP type, SED type, and organic EL type. Thedisplay unit 104 displays a window according to this embodiment. A network interface (to be referred to as “I/F” hereinafter) 105 is a communication interface used to connect a network. AROM 106 stores programs such as a basic I/O program and the like. - An
external storage drive 107 can load programs and the like stored in a medium 108 to this computer system. The medium 108 as a storage medium stores predetermined programs and related data. Theoperation unit 109 is a user interface used to accept operations and instructions from an operator of this apparatus, and comprises a keyboard, mouse, digital pen, and the like. Asystem bus 110 controls the flow of data in the apparatus. - Note that a mouse, digital pen, and tablet as examples of the
operation unit 109 can have the arrangements shown in, for example,FIGS. 1B and 1C . In this case, the mouse and tablet are connected to aninformation processing apparatus 100 using USB connections, and can serve as theoperation unit 109. - A
mouse 120 shown inFIG. 1B can constitute a part of theoperation unit 109. Themouse 120 has theleft button 121 and theright button 122. Although not shown, the bottom surface of themouse 120 comprises a structure for detecting a moving amount and direction of themouse 120 using a mechanical mechanism using a ball or an optical mechanism using an optical sensor. - A
digital pen 130 andtablet 140 shown inFIG. 1C can constitute a part of theoperation unit 109. Thedigital pen 130 can comprise atip switch 131 at the pen tip, and aside switch 132 on the side surface. Thetip switch 131 corresponds to theleft button 121 of themouse 120, and theside switch 132 corresponds to theright button 122 of themouse 120. Thetip switch 131 can be turned on by pressing it against thetablet 140. Theside switch 132 can be turned on when the operator holds it down with the finger. - The
tablet 140 comprises a pressure-sensitive or electrostatic contact sensor, and can detect the position of thedigital pen 130 when the tip of thedigital pen 130 is pressed against thetablet 140. When the operator moves thedigital pen 130 while pressing the tip against thetablet 140, thetablet 140 can detect the moving direction and amount of thedigital pen 130. Note that thetablet 140 may be integrated with thedisplay unit 104. - An example of the configuration of a window according to the embodiment of the invention will be described below with reference to
FIG. 2 .FIG. 2 shows an example of the configuration of a window according to the embodiment of the invention. - Referring to
FIG. 2 , awindow 200 has a rectangular shape, and is defined by four borders, that is, atop border 201,bottom border 202, leftborder 203, andright border 204. Thewindow 200 has fourcorners corner 207 is defined as an intersection between thetop border 201 and leftborder 203, thecorner 208 is defined as an intersection between theleft border 203 andbottom border 202, thecorner 209 is defined as an intersection between thebottom border 202 andright border 204, and thecorner 210 is defined as an intersection between theright border 204 andtop border 201. - In this embodiment, each border is divided into two different regions, that is, first and second region. More specifically, the first region is located to include the center of the border, and the second region is located to include the end portions of the border, and to sandwich the first region. For example, on the
top border 201, a first region 201 a including the center of the border is located to be sandwiched betweensecond regions 201 b including the end portions of the border. - As the division method of the first and second region, each border may be equally divided into three or the first region may be slightly longer or shorter than the length obtained when the border is equally divided into three regions. This embodiment will exemplify a case in which one border is equally divided into three regions.
- The
window 200 includes a title bar 205 anddisplay area 206. The title bar 205 displays information corresponding to the content displayed in thedisplay area 206. For example, when thedisplay area 206 displays document data, the title bar 205 displays a document name. Thedisplay area 206 displays the contents of data to be displayed. Thedisplay area 206 displays the contents of a document for a document file, or displays a corresponding image or graphic information for an image or graphic file. - In this embodiment, the window can be resized by dragging one of the four borders of the window based on the operation of the
operation unit 109, and moving the selected border in a direction perpendicular to that border. That is, in this embodiment, the drag operation corresponds to a window resize instruction operation. As will be described in detail below, the present invention is characterized in that the window resize instruction, including a scroll instruction indicating whether or not to scroll the display contents within the window, is accepted. - Note that this embodiment uses “drag” as a term that represents the concept to be described below. A case will be examined first wherein the mouse shown in
FIG. 1B is used as the operation unit 109 with the default settings of Microsoft Windows®. In this case, the display position of a cursor displayed on the screen of the display unit 104 is controlled in response to the movement of the mouse 120. When the user presses the left button 121 while the cursor is located on a target to be selected, that target to be selected is highlighted. In this embodiment, moving the cursor by moving the mouse 120 in this state will be referred to as "dragging".
- A case will be examined below wherein the
digital pen 130 and tablet 140 shown in FIG. 1C are used as the operation unit 109. In this case, the display position of the cursor displayed on the screen of the display unit 104 is controlled in response to the position of the digital pen 130 pressed against the tablet 140. When the user presses the digital pen 130 against the tablet 140 at a position corresponding to the display position of a target to be selected, the target to be selected is highlighted. In this embodiment, moving the cursor by moving the digital pen 130 on the tablet 140 in this state will be referred to as "dragging".
-
FIGS. 3A to 3D show display examples according to this embodiment when the user drags the cursor while locating it on the first or second region of the top border 201 or bottom border 202 of the window in the display state of FIG. 2. FIGS. 4A to 4D show display examples according to this embodiment when the user drags the cursor while locating it on the first or second region of the left border 203 or right border 204 of the window in the display state of FIG. 2. Note that a frame indicated by the dotted line in each figure represents a frame corresponding to the window 200 in FIG. 2 before resizing.
- As shown in
FIGS. 3A and 3C and FIGS. 4A and 4C, when the user drags the cursor while the user locates it on the first region (201 a, 202 a, 203 a, or 204 a), the display contents of the display area 206 are changed so as to move together with the border where the cursor is located.
- In
FIGS. 3A and 3C and FIGS. 4A and 4C, it can also be considered as if the display contents were moving in correspondence with the movement of the border. In this embodiment, such change in display contents will be referred to as "resizing with scrolling". Also, a state in which the display contents of the display area 206 are moved and displayed in correspondence with the movement of the border will be referred to as "with scrolling", "the display contents are scrolled", or "scrolling the display contents".
- As shown in
FIGS. 3B and 3D and FIGS. 4B and 4D, when the user drags the cursor while the user locates it on the second region (201 b, 202 b, 203 b, or 204 b), the display contents near a border (first border) where the cursor is located are changed. More specifically, the display contents are changed so as to be hidden in turn by the first border. On the other hand, the display contents near a border (second border) opposite to the border (first border) where the cursor is located remain unchanged.
- In
FIGS. 3B and 3D and FIGS. 4B and 4D, it can also be considered as if the display contents were fixed with respect to the movement of the border. In this embodiment, such change in display contents will be referred to as "resizing without scrolling". A state in which the display contents on the display area 206 are fixedly displayed with respect to the whole display screen will be referred to as "without scrolling", "the display contents are not scrolled", or "not scrolling the display contents".
- In this embodiment, "resizing with scrolling" and "resizing without scrolling" can both be executed during resizing in a continuous drag operation. That is, the resizing with scrolling and that without scrolling can be switched in real time during a continuous, single drag operation. Hence, the user can resize the window while adjusting the display position.
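The behavior just described — the dragged border always follows the cursor, while the display contents follow it only in the "with scrolling" mode — can be sketched as a single drag increment. This is a minimal illustrative sketch; the function and variable names are hypothetical, not taken from the embodiment.

```python
def drag_step(border_x, content_x, cursor_x_new, cursor_x_old, with_scrolling):
    """One increment of a left-border drag.

    The border snaps to the cursor's X position. In "resizing with
    scrolling" the content position moves by the same amount; in
    "resizing without scrolling" it stays fixed on the screen.
    """
    delta = cursor_x_new - cursor_x_old
    border_x = cursor_x_new
    if with_scrolling:
        content_x += delta
    return border_x, content_x

# Two increments of one continuous drag, switching modes in mid-drag:
b, q = drag_step(100, 120, 110, 100, with_scrolling=True)   # b=110, q=130
b, q = drag_step(b, q, 125, 110, with_scrolling=False)      # b=125, q=130
print(b, q)  # 125 130
```

Because the mode is re-evaluated on every increment, the two kinds of resizing can alternate within one drag, which is what lets the user adjust the display position while resizing.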
- Switching between the resizing with scrolling and that without scrolling will be described below with reference to
FIGS. 5A and 5B. FIGS. 5A and 5B are views for explaining the switching according to this embodiment.
- A case will be examined below wherein the user reduces the
window 200 by dragging the left border 203 of the window and moving it in a direction of an arrow 501 (right direction), as shown in FIG. 5A. Note that the width and height directions of the window 200 respectively match the X and Y directions of an X-Y coordinate system 502 set on the display screen where the window 200 is displayed. P0 represents an initial position of the cursor.
-
FIG. 5B expresses a state in which a position P(Px, Py) of the cursor is continuously changed like P0→P1→P2→P3 or P5→P6→P7→P8 during a single drag operation. Note that P(Px, Py) is a coordinate value based on the X-Y coordinate system 502 set on the display screen.
- In
FIG. 5B, let Ly be the length of the left border 203, and C(Cx, Cy) be the position of the corner 208 corresponding to the lower end of the left border 203 to be dragged. Note that Cx corresponds to the position of the left border 203 in the X direction. Also, let Q(Qx, Qy) be the position of arbitrary display contents on the display area 206 of the window 200. Note that the respective coordinates are based on the aforementioned X-Y coordinate system 502.
- While dragging the
left border 203, since the position Cx of the left border 203 in the X direction follows the X component of the cursor position (and is not related to the Y component), it can be expressed by:
-
Cx=Px (1) - From equation (1), since the cursor is kept located on the left border during dragging, a condition required to locate the cursor on the first region is described by:
-
Ly/3≦Py−Cy<2Ly/3 - Likewise, a condition required to locate the cursor on one of the
second regions 203 b of the left border 203 is described by:
-
0<Py−Cy<Ly/3 or 2Ly/3<Py−Cy<Ly - Therefore, upon making a drag operation while the cursor is located on the second region to attain the resizing without scrolling, this process can be expressed in association with the point Q by:
-
ΔQx =0 (2) - where ΔQx is a difference between Qx at the beginning of the resizing without scrolling, and Qx after the window is resized.
- Likewise, upon making a drag operation while the cursor is located on the first region to attain the resizing with scrolling, this process can be expressed in association with the position Q by:
-
ΔQx=ΔCx=ΔPx (3) - where ΔQx is a difference between Qx at the beginning of the resizing with scrolling, and Qx after the window size is resized. Likewise, ΔCx and ΔPx are differences between Cx and Px at the beginning of the resizing with scrolling, and Cx and Px after the window is resized. Note that these differences correspond to change amounts of the
window 200 in the X direction. - Upon application of the above concept to
FIG. 5B, when the cursor position falls within a range from P0 to P1, and from P2 to P3, since the cursor belongs to the second region 203 b, ΔQx=0, and the resizing without scrolling is executed. When the cursor position falls within a range from P1 to P2, since the cursor belongs to the first region 203 a, ΔQx=ΔPx, and the resizing with scrolling is executed.
- That is, while the cursor begins to be dragged from P0 and is continuously dragged to be moved to P3, the position of the
left border 203 of the window 200 moves from C0 to C3 according to the X component of the cursor, thus resizing the window. During this operation, the "resizing with scrolling" and "resizing without scrolling" are selectively executed according to a change in the position of the cursor in the Y direction.
- The same applies to a case in which the user upsizes the window by moving the cursor position like P5→P6→P7→P8. That is, when the cursor position falls within a range from P5 to P6 and from P7 to P8, since the cursor belongs to the
second region 203 b, ΔQx=0, and the resizing without scrolling is executed. When the cursor position falls within a range from P6 to P7, since the cursor belongs to the first region 203 a, ΔQx=ΔPx, and the resizing with scrolling is executed.
- That is, while the cursor begins to be dragged from P5 and is continuously dragged to be moved to P8, the position of the
left border 203 of the window 200 moves from C3 to C0 according to the X component of the cursor, thus resizing the window. During this operation, the "resizing with scrolling" and "resizing without scrolling" are selectively executed according to a change in the position of the cursor in the Y direction.
- Note that the case has been exemplified in
FIG. 5B wherein the left border 203 is dragged. Also, the same applies to the case wherein the top, bottom, and right borders (borders 201, 202, and 204) are dragged.
- The sequence of the aforementioned window resizing processing will be described below with reference to the flowchart of
FIG. 6. FIG. 6 is a flowchart showing an example of the window resizing processing according to the first embodiment. The processing corresponding to the flowchart shown in FIG. 6 is implemented when the CPU 101 reads out a corresponding processing program stored in the HD 103 onto the RAM 102 and executes that program to control respective components.
- Note that
FIG. 6 describes a case wherein the user resizes the window by dragging the left border 203 of the window 200. However, the embodiment of the invention is not limited to the case wherein the left border 203 is dragged. That is, the same processing as in FIG. 6 can resize the window by dragging the top border 201, bottom border 202, and right border 204.
- In step S601, the
CPU 101 acquires operation information (information of a first instruction operation) of a first button of the mouse 120 or digital pen 130 of the operation unit 109, and information (moving information) of the moving direction and amount of the mouse 120 or digital pen 130. Note that the first button (first operation unit) corresponds to the left button 121 of the mouse 120 if the mouse 120 is used in the default settings of Microsoft Windows®. Also, the first button corresponds to the tip switch 131 at the pen tip of the digital pen 130.
- The
CPU 101 determines in step S602 based on the operation information of the first button acquired in step S601 whether or not the first button is switched from OFF to ON. If it is determined that the first button is switched to ON (“YES” in step S602), the process advances to step S603. On the other hand, if it is determined that the first button is kept OFF without being switched to ON (“NO” in step S602), the process returns to step S601 to continue the processing. - In step S603, the
CPU 101 calculates the position coordinate of the cursor (cursor position coordinate) based on the moving amount information acquired in step S601 to determine on which border of the window 200 the cursor is located. This determination process can be attained by seeing which of predetermined regions set based on the first and second regions of the borders that configure the window 200 includes the cursor position coordinate.
- If it is determined that the cursor is located on the
left border 203 of the window 200 (“left border” in step S603), it can be determined that the user begins to drag the left border 203. In this case, the process advances to step S604. On the other hand, if the cursor is located on one of the remaining borders (on one of the top border 201, bottom border 202, and right border 204) (“another border” in step S603), it can be determined that the user begins to drag another border. In this case, the process advances to step S605. In step S605, the CPU 101 executes window resizing processing by dragging of another border.
- In step S604, the
CPU 101 determines the cursor position coordinate P(Px, Py) at the beginning of dragging, as shown in FIG. 5A, for the window which begins to be dragged. Also, the CPU 101 determines the position C(Cx, Cy) of the corner 208 at the lower end of the left border 203 and the position Q(Qx, Qy) of the arbitrary display contents, as shown in FIG. 5B.
- In step S606, the
CPU 101 further acquires the operation information of the first button and the moving amount information, and updates the cursor position coordinate P(Px, Py) based on the moving amount information. The CPU 101 then determines in step S607 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S607), this processing ends. In this case, a so-called “drop” operation is made.
- On the other hand, if the first button is kept ON (“YES” in step S607), the process advances to step S608. In step S608, the
CPU 101 sets the X position (Cx) of the left border 203 of the window 200 to match the X component (Px) of the cursor position coordinate updated in step S606. In this way, the position of the left border 203 follows the movement of the cursor in the X direction.
- The
CPU 101 determines in step S609 based on the cursor position coordinate updated in step S606 whether or not the cursor is located on the first region. If it is determined that the cursor is located on the first region (“YES” in step S609), the process advances to step S610. On the other hand, if it is determined that the cursor is located on the second region (“NO” in step S609), the process advances to step S611. - In step S610, the
CPU 101 sets the moving amount ΔQx of the position Q of the arbitrary display contents in the X direction to be equal to the moving amount ΔPx of the cursor in the X direction, so as to scroll the display contents upon resizing the window. On the other hand, in step S611, the CPU 101 sets the moving amount ΔQx to be zero so as to suppress scrolling of the display contents upon resizing the window.
- In step S612, the
CPU 101 updates display of the cursor and window 200 based on the position of the left border 203 determined in step S608 and the moving amount ΔQx determined in step S610 or S611. After that, the process returns to step S606 to continue the processing.
- Note that a loop from step S606 to step S612 represents cursor movement during dragging, that is, it indicates that dragging is continued and resizing of the window is in progress. When the control leaves this loop, the drop operation has been made and the window size is settled.
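The loop of steps S606 to S612 can be sketched as follows. This is an illustrative Python sketch only: the dictionary and event-tuple shapes are hypothetical stand-ins for the window state and the operation information acquired from the operation unit 109, and the equal-thirds region test follows the example of this embodiment.

```python
def resize_left_border(window, drag_events):
    """Simulate the S606-S612 loop for a left-border drag.

    window: dict with 'cx' (left-border X), 'cy' (Y of the border's
    lower end), 'ly' (border length), and 'qx' (X of some display
    content). drag_events: (button_on, px, py) samples; the loop ends
    at the first sample with the first button off (the "drop").
    """
    px_prev = window["cx"]  # dragging starts with the cursor on the border
    for button_on, px, py in drag_events:          # S606: acquire input
        if not button_on:                          # S607: drop -> settle size
            break
        window["cx"] = px                          # S608: border follows cursor
        t = py - window["cy"]
        in_first = window["ly"] / 3 <= t < 2 * window["ly"] / 3  # S609
        if in_first:                               # S610: with scrolling
            window["qx"] += px - px_prev
        # S611: without scrolling -> qx unchanged
        px_prev = px                               # S612: redraw, then loop
    return window

w = {"cx": 100.0, "cy": 0.0, "ly": 300.0, "qx": 120.0}
resize_left_border(w, [(True, 110, 50), (True, 130, 150),
                       (True, 140, 150), (False, 140, 150)])
# Border settles at cx=140; qx scrolled only while py=150 (first
# region): 120 + 20 + 10 = 150.
```

Note how the region test is repeated on every iteration, which is what makes the switch between the two resizing modes available in real time during one drag.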
- As described above, according to this embodiment, each border of the window is divided into two different regions, and the method of changing the display contents within the window can be controlled based on the selected region. Since the region can be selected in real time during resizing of the window, the position of the display contents within the window can be controlled simultaneously with resizing. In this way, a desired display result can be obtained by a series of operations, thus improving the work efficiency.
- The second embodiment of the invention will be described hereinafter. This embodiment extends the first control technique.
- Upon displaying a window on a
display unit 104, the following three display states are normally available: - 1. a display state in which both the height and width of the window are maximized to fit a whole display screen (so-called full screen display);
- 2. a display state in which only one icon or title is displayed in a small size (so-called minimum display); and
- 3. a display state in which the window occupies only a part of the display screen.
- The display states 1 and 3 will be compared. In case of the
display state 1, since the window itself is fixed, there is no trouble upon handling the window. However, in order to refer to another window, a switching operation for canceling the full screen display state is required. - On the other hand, in case of the
display state 3, there is a merit of allowing the user to refer to a plurality of windows, but it is troublesome since the sizes and locations of the respective windows need to be determined and organized. Especially, when a relatively large window completely covers a relatively small window, the user needs to move the upper window to an appropriate location to access the lower window, resulting in inconvenience. - In this embodiment, in order to allow use of a window of a type that considers the merits of both the display states, window display of the first embodiment is applied to so-called “full screen display”.
- As described above, in “full screen display”, a window is maximized in the X and Y directions of the display screen of the
display unit 104, and is fixed in size. The window cannot be resized unless the full screen display state is canceled. - By contrast, in the full screen display according to this embodiment, a window is maximized in only one of the X and Y directions within the display screen, and is fixed in size in that direction. In the remaining direction, one border is fixed to the end of the display screen, and only the other border is movable by dragging. By operating this border that can be dragged, the window can be resized in one direction.
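This one-direction resizing can be sketched as follows, assuming a window kept maximized in the Y direction with its right border fixed at the screen edge so that only the left border moves. The function name, screen parameters, and rectangle representation are hypothetical; this is a sketch of the behavior, not the actual implementation.

```python
def drag_left_border_fullscreen(screen_w, screen_h, cursor_x):
    """Resize a window that is maximized in Y: only the left border moves.

    The right border stays fixed at the screen edge, and the top and
    bottom borders stay at the screen's top and bottom, so the drag
    resizes the window in the X direction only.
    """
    left = min(max(cursor_x, 0), screen_w)  # clamp the movable border
    return {"left": left, "right": screen_w, "top": 0, "bottom": screen_h}

print(drag_left_border_fullscreen(1920, 1080, 600))
# {'left': 600, 'right': 1920, 'top': 0, 'bottom': 1080}
```

Because three borders are fixed, only a single coordinate changes during the drag, which is why reordering such windows reduces to a one-dimensional problem.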
-
FIGS. 7A to 7C show examples of full screen display according to this embodiment. In FIGS. 7A to 7C, reference numeral 700 denotes a whole display screen of the display unit 104. Since the window configuration is the same as that in FIG. 2 of the first embodiment, corresponding reference numerals will be used. A left border 203 of a window 200 includes a first region 203 a and second regions 203 b. The user can drag the first and second regions 203 a and 203 b using a cursor 701. The directions of the whole display screen 700 and window 200 are determined based on an X-Y coordinate system 502.
-
FIG. 7A shows a state in which the size of the window 200 matches that of the whole display screen 700. That is, FIG. 7A corresponds to the full screen display state.
-
FIG. 7B shows a state in which the window 200 is resized when the user locates the cursor 701 on the second region 203 b and drags it in the X direction. By dragging in the X direction using the second region 203 b, the size of the window 200 changes in only the X direction. At this time, a right border 204 opposite to the dragged left border 203 is fixed to the end of the display area, and only the left border 203 can be dragged. With this movement, the window is resized in one direction. Note that the window 200 is fixed in a maximum size in the Y direction perpendicular to the dragging direction. Note that in case of FIG. 7B, since the second region 203 b is used, resizing without scrolling described in the first embodiment is executed.
-
FIG. 7C shows a state in which the window is resized when the user locates the cursor 701 on the first region 203 a and drags it in the X direction. By dragging in the X direction using the first region 203 a, the size of the window 200 changes in only the X direction. At this time as well, the right border 204 opposite to the dragged left border 203 is fixed to the end of the display area, and only the left border 203 can be dragged. With this movement, the window 200 is resized in one direction. Note that the window 200 is fixed in a maximum size in the Y direction perpendicular to the dragging direction. Note that resizing with scrolling described in the first embodiment is executed since the first region 203 a is used at this time.
- In
FIGS. 7A to 7C, the left border 203 is used as a border having a function of resizing the window. However, any of the remaining three borders which configure the window 200 may be used as a border having a function of resizing the window. For example, FIGS. 8A to 8C show a case using a bottom border 202. That is, FIG. 8A shows an example of a state in which the size of the window 200 according to this embodiment matches that of the whole display screen 700. FIG. 8B shows an example of a state in which the window 200 is resized when the user locates the cursor 701 on a second region 202 b and drags it in the Y direction according to this embodiment. FIG. 8C shows an example of a state in which the window is resized when the user locates the cursor 701 on a first region 202 a and drags it in the Y direction according to this embodiment. The only difference between FIGS. 8A to 8C and FIGS. 7A to 7C is the border used to resize the window.
- In case of
FIGS. 7A to 7C, the left border 203 corresponds to the movable border, the right border 204 corresponds to the first fixed border (opposing fixed border), and a top border 201 and the bottom border 202 respectively correspond to the second and third fixed borders. In case of FIGS. 8A to 8C, the bottom border 202 corresponds to the movable border, the top border 201 corresponds to the first fixed border (opposing fixed border), and the left and right borders 203 and 204 respectively correspond to the second and third fixed borders.
- Note that the display position on a
display area 206 of the window 200 can be controlled in the same manner as in the first embodiment. The only difference is that the first and second regions, which are given to all four borders in the first embodiment, are provided on only one border in this embodiment.
- As described above, the window according to this embodiment is maintained in a maximized state in one of the X and Y directions (width and height directions). Therefore, upon reordering a plurality of windows, only a one-dimensional positional relationship need be considered. As a result, compared to reordering of windows in consideration of a two-dimensional positional relationship, the operation is greatly simplified, thus eliminating much of the complexity.
- Since the window can still be resized, a window hidden below the upper window can be revealed, unlike the case in which a window is completely maximized in both the X and Y directions, thus improving convenience.
- Also, such window can be defined as a fourth window display state in addition to the aforementioned window display states 1 to 3.
- Note that the point of this embodiment lies not only in that the window can be resized in one direction in the full screen display state, but also in that the display position of the display contents within the window can be controlled at the time of the drag operation, in combination with the invention according to the first embodiment.
- The third embodiment of the invention will be described hereinafter. This embodiment relates to the second control technique.
- The aforementioned first embodiment has proposed the display control method upon resizing the window by dragging one of the borders which configure the window. This method is effective in the case in which the window is often resized by mainly dragging the border. Especially, this method is very effective for the window which is maximized in only one direction, as described in the second embodiment.
- However, a normal window can be resized by dragging one of its corners, as shown in
FIGS. 23A to 23D. Whether each user drags the border or corner to resize such a normal window depends on the user's preference, the display contents of individual applications, individual work contents, and the like.
- In the display control method according to the aforementioned first embodiment, upon resizing a window by dragging its border, ON/OFF switching of scrolling upon resizing is controlled based on the cursor position in the direction perpendicular to the dragging direction. However, upon resizing a window by dragging its corner, the cursor movement needs to be instructed two-dimensionally. That is, since both the X and Y components of the cursor movement get directly involved in the movement of the corner, one component of the cursor movement cannot be used in switching control between resizing with scrolling and that without scrolling.
- Hence, this embodiment uses ON/OFF of a second button of a
mouse 120 or digital pen 130 of an operation unit 109 in switching control between resizing with scrolling and that without scrolling upon resizing a window. Note that the second button (second operation unit) corresponds to a right button 122 of the mouse 120 in the default settings of Microsoft Windows®. On the other hand, the second button corresponds to a side switch 132 on the side surface of the digital pen 130. Also, the second button may be assigned to a specific key such as a control key.
- The operation of the display control method according to this embodiment will be described below with reference to
FIGS. 9A to 9C. FIG. 9A shows a state before the beginning of dragging, in which the user locates a cursor P on a corner 209 (P0). FIG. 9B shows a state in which the user moves the cursor P from the position P0 to a position P1 of the corner 209. Upon this cursor movement, the user turns on the second button to execute the resizing with scrolling. Furthermore, FIG. 9C shows a state in which the user moves the cursor P from P1 to P2. Upon this cursor movement, the user turns off the second button to execute the resizing without scrolling.
- Note that a
dotted line 901 in FIGS. 9B and 9C indicates the size of a window 200 before resizing. The contents within a dotted line 902 indicate the display contents falling outside the window 200 after resizing.
- It should be noted that the first button is kept ON during dragging irrespective of ON/OFF of the second button.
-
FIGS. 9A to 9C are views for explaining the display control method of this embodiment by adopting the configuration of the window corresponding to FIG. 2, but they omit descriptions of the first and second regions for the sake of simplicity. Note that the third embodiment can be practiced in combination with the first embodiment, and this embodiment can also be applied to the window shown in FIG. 2, which has the first and second regions.
- This embodiment can assure similar operations on any of four
corners 207 to 210 of the window 200, and the following description will be given taking as an example a case in which the user drags the lower right corner 209.
- In
FIGS. 9A to 9C, parameters are defined as follows. Let C(Cx, Cy) be the position of the corner 209 of the window 200, B(Bx, By) be the position of a point corresponding to that immediately below the point C in an initial state of the display contents within the window, and Q(Qx, Qy) be the position of arbitrary display contents within the window. Note that respective coordinate values are based on an X-Y coordinate system 502 set with respect to the display screen. Assume that the position C changes like C0, C1, and C2, the position B changes like B0, B1, and B2, and the position Q changes like Q0, Q1, and Q2 in correspondence with the movement of the cursor position from P0 to P1 and to P2.
- At the beginning of dragging, as shown in
FIG. 9A, the user locates the cursor position P at the position of the lower right corner 209, and switches the first button from OFF to ON there. At this time, P0=C0=B0.
- During the movement of the cursor position from P0 to P1 after the beginning of dragging in
FIG. 9B, the corner 209 of the window 200 moves to follow the cursor, and the display contents within a display area 206 also move to follow the cursor (since they are scrolled). At this time, P1=C1=B1. That is, the relationship among P, C, B, and Q can be expressed by:
-
ΔC(ΔCx, ΔCy)=ΔP(ΔPx, ΔPy) (4) -
ΔB(ΔBx, ΔBy)=ΔP(ΔPx, ΔPy) (5) -
ΔQ(ΔQx, ΔQy)=ΔP(ΔPx, ΔPy) (6) - where Δ indicates a change amount.
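Equations (4) to (6) — together with the fixed-content case of equations (7) to (9) given below — can be sketched as one drag increment keyed to the second button's state. This is an illustrative sketch with hypothetical names, not the actual implementation.

```python
def corner_drag_step(c, q, dp, second_button_on):
    """One increment of a lower-right-corner drag.

    c: corner position (Cx, Cy); q: position of arbitrary display
    contents (Qx, Qy); dp: cursor change (dPx, dPy). The corner always
    follows the cursor (dC = dP); the contents follow it only while
    the second button is held (dQ = dP), otherwise dQ = (0, 0).
    """
    c = (c[0] + dp[0], c[1] + dp[1])
    if second_button_on:
        q = (q[0] + dp[0], q[1] + dp[1])
    return c, q

c, q = corner_drag_step((200, 200), (80, 80), (20, 10), True)   # with scrolling
c, q = corner_drag_step(c, q, (5, 5), False)                    # without scrolling
print(c, q)  # (225, 215) (100, 90)
```

Unlike the border drag of the first embodiment, both components of the cursor movement are consumed by the corner, so the mode flag has to come from the second button rather than from the cursor position.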
- Furthermore, during the movement of the cursor position from P1 to P2 in
FIG. 9C, the corner 209 of the window similarly moves to follow the cursor P. However, the display contents within the display area 206 do not follow the cursor movement since they are not scrolled in this case. At this time, P2=C2≠B2 (=B1). That is, the relationship among P, C, B, and Q can be expressed by:
-
ΔC(ΔCx, ΔCy)=ΔP(ΔPx, ΔPy) (7) -
ΔB(ΔBx, ΔBy)=(0, 0) (8) -
ΔQ(ΔQx, ΔQy)=(0, 0) (9) - The sequence of the aforementioned window resizing processing will be described below with reference to the flowchart of
FIG. 10. FIG. 10 is a flowchart showing an example of the window resizing processing according to the third embodiment. The processing corresponding to the flowchart shown in FIG. 10 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control respective components.
- Note that
FIG. 10 describes a case in which the user resizes the window by dragging the lower right corner 209 of the window 200. The embodiment of the invention is not limited to the case in which the lower right corner 209 is dragged. That is, the same processing as in FIG. 10 can resize the window by dragging the upper left corner 207, lower left corner 208, and upper right corner 210.
- In step S1001, the
CPU 101 acquires operation information (information of a first instruction operation) of a first button of the mouse 120 or digital pen 130 of the operation unit 109, and information (moving information) of the moving direction and amount of the mouse 120 or digital pen 130. Note that the first button corresponds to the left button 121 of the mouse 120 if the mouse 120 is used in the default settings of Microsoft Windows®. Also, the first button corresponds to a tip switch 131 at the pen tip of the digital pen 130.
- The
CPU 101 determines in step S1002 based on the operation information of the first button acquired in step S1001 whether or not the first button is switched from OFF to ON. If it is determined that the first button is switched to ON (“YES” in step S1002), the process advances to step S1003. On the other hand, if it is determined that the first button is kept OFF without being switched to ON (“NO” in step S1002), the process returns to step S1001 to continue the processing. - In step S1003, the
CPU 101 calculates the position coordinate of the cursor P (cursor position coordinate) based on the moving amount information acquired in step S1001 to determine on which corner of the window 200 the cursor is located. This determination process can be attained by seeing which of predetermined regions set based on the corners that configure the window 200 includes the cursor position coordinate.
- If it is determined that the cursor is located on the lower
right corner 209 of the window 200 (“lower right corner 209” in step S1003), it can be determined that the user begins to drag the lower right corner 209. In this case, the process advances to step S1004. On the other hand, if the cursor is located on one of the remaining corners (on one of the corners 207, 208, and 210) (“another corner” in step S1003), it can be determined that the user begins to drag another corner. In this case, the process advances to step S1005. In step S1005, the CPU 101 executes window resizing processing by dragging of another corner.
- In step S1004, the
CPU 101 determines the position coordinates P(Px, Py), C(Cx, Cy), B(Bx, By), and Q(Qx, Qy) at the beginning of dragging, as shown in FIG. 9A, for the window which begins to be dragged. Note that the definitions of respective coordinates are the same as those described above.
- In step S1006, the
CPU 101 further acquires the information of the first instruction operation and moving amount information, and also operation information of a second button (information of a second instruction operation) of the mouse 120 or digital pen 130 of the operation unit 109. Also, the CPU 101 updates the cursor position coordinate P(Px, Py) based on the moving amount information. The CPU 101 then determines in step S1007 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S1007), this processing ends. In this case, a so-called “drop” operation is made.
- On the other hand, if the first button is kept ON (“YES” in step S1007), the process advances to step S1008. In step S1008, the
CPU 101 sets the position C(Cx, Cy) of the lower right corner 209 of the window 200 to match the cursor position P(Px, Py) updated in step S1006. In this way, the position of the lower right corner 209 follows the cursor movement.
- The
CPU 101 determines in step S1009, based on the operation information of the second button acquired in step S1006, whether or not the second button is ON. If it is determined that the second button is ON (“YES” in step S1009), the process advances to step S1010. On the other hand, if it is determined that the second button is OFF (“NO” in step S1009), the process advances to step S1011. - In step S1010, the
CPU 101 sets the moving amount ΔQ(ΔQx, ΔQy) of the position Q of the arbitrary display contents to be equal to the moving amount ΔP(ΔPx, ΔPy) of the cursor. In this way, the display contents are scrolled by a size corresponding to the change amounts of the window 200 in the X and Y directions. On the other hand, in step S1011, the CPU 101 sets the moving amount ΔQ to (0, 0). In this case, the display contents are not scrolled. - In step S1012, the
CPU 101 updates display of the cursor and window 200 based on the position of the lower right corner 209 determined in step S1008 and the moving amount ΔQ determined in step S1010 or S1011. After that, the process returns to step S1006 to continue the processing. - Note that the loop from step S1006 to step S1012 represents cursor movement during dragging; that is, dragging is continued and resizing of the window is in progress during this loop. When the control leaves this loop, the drop operation has been made to settle the window size.
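The corner-drag loop described above (steps S1006 to S1012) can be sketched as follows in Python. The event tuples, the Window fields, and the function name are illustrative assumptions for this sketch, not part of the patent disclosure.

```python
# Illustrative sketch of the corner-drag loop (steps S1006 to S1012): while the
# first button stays ON, the lower right corner C follows the cursor P, and the
# display contents Q scroll with the cursor only while the second button is ON.
from dataclasses import dataclass

@dataclass
class Window:
    cx: float  # X of the lower right corner C
    cy: float  # Y of the lower right corner C
    qx: float  # X of the position Q of arbitrary display contents
    qy: float  # Y of the position Q of arbitrary display contents

def drag_corner(window, events):
    """events: iterable of (first_button_on, second_button_on, px, py) samples."""
    last = None
    for first_on, second_on, px, py in events:
        if not first_on:                      # S1007: released -> "drop", done
            break
        window.cx, window.cy = px, py         # S1008: corner follows the cursor
        if last is not None and second_on:    # S1009/S1010: scroll with cursor
            window.qx += px - last[0]         # deltaQ = deltaP
            window.qy += py - last[1]
        # S1011: second button OFF -> deltaQ = (0, 0), contents not scrolled
        last = (px, py)                       # S1012 would redraw the window here
    return window
```

Because the second button is sampled on every pass of the loop, scrolling of the contents can be switched on and off in real time while the same drag continues, which is the point of this embodiment.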
- As described above, according to this embodiment, the two different operation buttons of the operation unit 109 are used, and the change method of the display contents within the window can be controlled based on combinations of the button operations. Since the combinations of the button operations can be changed in real time during resizing of the window, the position of the display contents within the window can be controlled simultaneously with resizing. In this way, a desired display result can be obtained by a series of operations, thus improving the work efficiency. - Note that the case has been explained wherein the window is resized mainly by dragging a corner of the window. However, the display control method according to this embodiment can also be applied to a case wherein the window is resized by dragging its border. In this case, ON/OFF of scrolling upon resizing can be controlled by the same operations in the case of dragging the corner and in that of dragging the border.
- Note that the display control method (first control technique) according to the first embodiment and that (second control technique) according to this embodiment can be compared as follows.
- The first control technique is effective when importance is attached to resizing by dragging a border, and is especially effective in the case of the second embodiment. Considering only the case of dragging the border, the first control technique can achieve the desired resizing by a simpler operation than the second control technique.
- By contrast, the second control technique is effective for cases in which both the corner and the border may be dragged, and for cases that also attach importance to dragging of the corner. Using the second control technique, the desired resizing can be achieved by an operation common to the case of dragging the corner and that of dragging the border.
- The fourth embodiment of the invention will be described hereinafter. This embodiment relates to the aforementioned third control technique.
- This embodiment will explain display control of the present invention, which is applied to a case in which a window includes a plurality of sub-windows, and each sub-window is resized by dragging a boundary between the neighboring sub-windows.
- Some applications display using a window defined by a single area, and some other applications display using a window including a plurality of sub-windows.
FIG. 11 shows an example of the latter type of application. In this case, using the plurality of sub-windows, the display efficiency can be improved compared to the case of a single window, and a more comfortable user interface can be provided. - When a window includes a plurality of sub-windows, it is common practice to resize each sub-window in the window by dragging a boundary between neighboring sub-windows. At this time, in the conventional window configuration, ON/OFF of scrolling upon resizing needs to be determined in advance for each sub-window, or a scroll operation needs to be done after resizing.
- For example, the left or top part of the display contents in each sub-window is preferentially displayed in some cases. This is based on the same situation as a window defined by a single area, that is, the idea that the first character of a sentence and the first line of a page are to be preferentially displayed.
- Therefore, in an example of a window divided into left and right sub-windows, upon resizing the sub-windows by dragging a boundary, the display contents of the left sub-window are not scrolled, and those of the right sub-window are scrolled. Likewise, in an example of a window divided into upper and lower sub-windows, the display contents of the upper sub-window are not scrolled, and those of the lower sub-window are scrolled.
- By contrast, this embodiment provides a display control method that allows the user to concurrently switch ON/OFF of scrolling of the sub-windows on the two sides of a boundary in real time while resizing by dragging the boundary. Hence, in this embodiment, the need to fix ON/OFF of scrolling in advance is obviated, unlike in the related art.
- Display control processing according to this embodiment will be described below. In this embodiment, in order to control whether or not to scroll the display contents for each sub-window, the following four control modes are available. Note that a case will be examined below wherein a window includes a sub-window on the first side with respect to a boundary, and that on the second side.
-
Control mode 1. resizing with scrolling of both the sub-windows on the first and second sides -
Control mode 2. resizing with scrolling of the sub-window on the first side and without scrolling of the sub-window on the second side -
Control mode 3. resizing without scrolling of the sub-window on the first side and with scrolling of the sub-window on the second side -
Control mode 4. resizing without scrolling of both the sub-windows on the first and second sides - Note that the relationship between the sub-windows on the first and second sides can be considered as that between neighboring sub-windows on, for example, the left and right sides or the upper and lower sides of the boundary.
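The four control modes above can be summarized as per-side scroll flags. The following Python fragment is purely illustrative; the dictionary, the function name, and the (first side, second side) representation are assumptions made for this sketch.

```python
# Illustrative encoding of control modes 1 to 4 as (first side, second side)
# scroll flags for a boundary dragged in the X direction.
SCROLL_FLAGS = {
    1: (True, True),    # mode 1: both sub-windows scroll
    2: (True, False),   # mode 2: only the first-side sub-window scrolls
    3: (False, True),   # mode 3: only the second-side sub-window scrolls
    4: (False, False),  # mode 4: neither sub-window scrolls
}

def scroll_deltas(mode, dpx):
    """Per-side scroll amounts for a boundary moved by dpx in the X direction."""
    first, second = SCROLL_FLAGS[mode]
    return (dpx if first else 0.0, dpx if second else 0.0)
```

A side whose flag is False keeps its contents fixed relative to the window, so resizing hides or reveals content near the moving boundary on that side only.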
-
FIG. 12 is a view for explaining this embodiment, taking as an example a window which is divided into left and right sub-windows as the first and second sub-windows. Note that the user can drag only one boundary per drag operation, and that the same display control applies to a window divided into upper and lower sub-windows as to one divided into left and right sub-windows. - In
FIG. 12, a window 1200 is defined by borders and is divided into sub-windows by boundaries 1205 and 1206. - Each of the boundaries 1205 and 1206 is divided into two regions: in FIG. 12, the upper half region is called a first region, and the lower half region is called a second region. Note that this division method is merely an example, and is not limited to that shown in FIG. 12. For example, the same division method as that of each border in the first embodiment may be adopted. - In
FIG. 12, the position of a cursor P can be expressed by P(Px, Py) based on an X-Y coordinate system 502 set on the display screen on which the window 1200 is displayed. Let LBy be the length of the boundary 1205 within the window 1200, and BL(BLx, BLy) be the position of the intersection between the lower end of the boundary 1205 and the lower border 1202. Note that BLx corresponds to the position of the boundary 1205 in the X direction. - Note that a condition (condition 1) required to locate the cursor P on the first region is described by:
-
LBy/2≦Py−BLy≦LBy - Likewise, a condition (condition 2) required to locate the cursor P on the second region is described by:
-
0<Py−BLy<LBy/2
- Let QL(QLx, QLy) be the position of arbitrary display contents within the sub-window 1207 on the left side of the
boundary 1205, and QR(QRx, QRy) be the position of arbitrary display contents within the sub-window 1208 on the right side. Theboundary 1205 will be described below. However, the scroll control of the display contents upon resizing the sub-windows with reference to theboundary 1206 can be similarly executed. - Upon execution of resizing without scrolling of the sub-windows in case of a drag operation, the following expression can be made in association with the positions QL and QR:
-
ΔQLx=ΔQRx=0 (10) - where ΔQLx and ΔQRx are differences of QLx and QRx before and after resizing of the sub-windows.
- Likewise, upon execution of resizing with scrolling of the sub-windows, the following expression can be made in association with the positions QL and QR:
-
ΔQLx=ΔQRx=ΔPx (11) - where ΔQLx and ΔQRx are differences of QLx and QRx before and after resizing of the sub-windows. Likewise, ΔPx is a difference of Px before and after resizing of the sub-windows. Note that these differences correspond to the change amounts of the
boundary 1205 in the X direction. - In this embodiment, the four types of resizing control of the
control modes 1 to 4 are switched by combining the region (first or second) on which the dragged cursor is located with ON/OFF of the second button operation. - In the
control mode 1, the cursor located on the first region is dragged, and the second button is ON. - In the
control mode 2, the cursor located on the first region is dragged, and the second button is OFF. - In the
control mode 3, the cursor located on the second region is dragged, and the second button is ON. - In the
control mode 4, the cursor located on the second region is dragged, and the second button is OFF. - In this way, the display control method according to this embodiment uses, in cooperation, the control based on the position of the cursor in the Y direction used in the first and second embodiments and the control based on the second button of the operation unit 109 used in the third embodiment. In any of the above four patterns, switching between resizing with scrolling and resizing without scrolling for each of the sub-windows on the two sides is controlled concurrently during a single, continuous drag operation and cursor movement. The start and continuation of dragging are controlled by ON/OFF of the first button of the operation unit 109, as in the above embodiments.
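The combined determination described above can be sketched as follows: the half of the boundary on which the cursor sits (conditions 1 and 2) plus the state of the second button select one of control modes 1 to 4. Function names and the string return values are assumptions made for this sketch.

```python
# Illustrative sketch of the combined region/button determination of this
# embodiment; the mode numbering follows the list of control modes 1 to 4.
def boundary_region(py, bly, lby):
    """bly: Y of the boundary's lower end BL; lby: boundary length LBy."""
    d = py - bly
    if lby / 2 <= d <= lby:   # condition 1: upper half -> first region
        return "first"
    if 0 < d < lby / 2:       # condition 2: lower half -> second region
        return "second"
    return None               # cursor is not on the boundary

def control_mode(region, second_button_on):
    if region == "first":
        return 1 if second_button_on else 2   # modes 1 and 2
    return 3 if second_button_on else 4       # modes 3 and 4
```

Since both inputs are re-evaluated on every cursor sample, the user can change modes mid-drag simply by sliding along the boundary or toggling the second button.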
FIG. 13 shows an example of a change in display contents when the user moves the boundary on a window divided by a single boundary. - In
FIG. 13, reference numeral 1301 denotes a state before the beginning of dragging. In this state, a left sub-window displays alphabetical letters “ABD”, and a right sub-window displays three rows of numerals “1” to “9”. - In this display state of the
window 1301, when the user locates the cursor on the second region, and drags it while the second button is OFF, a display state of awindow 1302 is set. At this time, since both the left and right sub-windows are not scrolled, letters “EE” hidden on the left sub-window are newly displayed. On the other hand, on the right sub-window, “1” and “2” are fully hidden and “3” is partially hidden by the movement of the boundary. - When the user locates the cursor on the first region and drags it while the second button is ON, a display state of a
window 1303 is set. Since both the left and right sub-windows are scrolled, the display contents near the boundary remain unchanged, but those near the left and right borders of the window are changed. - Furthermore, when the user locates the cursor on the second region and drags it while the second button is ON, a display state of a
window 1304 is set. At this time, only the right sub-window is scrolled. Hence, alphabetical letters “FG” hidden on the left sub-window are newly displayed near the boundary. On the other hand, on the right sub-window, numerals “1 2 3” near the right border of the window, which were displayed on the window 1303, are hidden. - Moreover, when the user locates the cursor on the first region and drags it while the second button is OFF, a display state like a
window 1305 is set. At this time, only the left sub-window is scrolled. Hence, on the left sub-window, alphabetical letters “AB” hidden near the left border of the window are displayed. On the other hand, since the right sub-window is not scrolled, numerals “3 4 5 6” are hidden by the boundary. - The sequence of the aforementioned window resizing processing will be described below with reference to the flowchart of
FIG. 14. FIG. 14 is a flowchart showing an example of the window resizing processing according to the fourth embodiment. The processing corresponding to the flowchart shown in FIG. 14 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control the respective components. - Note that
FIG. 14 describes a case in which the user resizes the sub-windows by dragging the boundary 1205 of the window 1200. However, the embodiment of the invention is not limited to the case in which the boundary 1205 is dragged. That is, the same processing as in FIG. 14 can resize the sub-windows by dragging the boundary 1206 or another boundary. - In step S1401, the
CPU 101 acquires operation information (information of a first instruction operation) of a first button of a mouse 120 or digital pen 130 of the operation unit 109, and information (moving amount information) of the moving direction and amount of the mouse 120 or digital pen 130. Note that the first button corresponds to a left button 121 of the mouse 120 if the mouse 120 is used with the default settings of Microsoft Windows®. Also, the first button corresponds to a tip switch 131 at the pen tip of the digital pen 130. - The
CPU 101 determines in step S1402 based on the operation information of the first button acquired in step S1401 whether or not the first button is switched from OFF to ON. If it is determined that the first button is switched to ON (“YES” in step S1402), the process advances to step S1403. On the other hand, if it is determined that the first button is kept OFF without being switched to ON (“NO” in step S1402), the process returns to step S1401 to continue the processing. - In step S1403, the
CPU 101 calculates the position coordinate of the cursor P (cursor position coordinate) based on the moving amount information acquired in step S1401 to determine on which boundary of the window 1200 the cursor is located. This determination process can be attained by checking which predetermined region set based on the boundaries included in the window 1200 includes the cursor position coordinate. - If it is determined that the cursor is located on the
boundary 1205 of the window 1200 (“boundary 1205” in step S1403), it can be determined that the user begins to drag the boundary 1205. In this case, the process advances to step S1404. On the other hand, if the cursor is located on one of the remaining boundaries (on the boundary 1206 or the like) (“another” in step S1403), it can be determined that the user begins to drag another boundary. In this case, the process advances to step S1405. In step S1405, the CPU 101 executes window resizing processing by dragging of another boundary. - In step S1404, the
CPU 101 determines the position coordinates P(Px, Py), BL(BLx, BLy), QL(QLx, QLy), and QR(QRx, QRy) at the beginning of dragging, as shown in FIG. 12, for the window which begins to be dragged. Note that the definitions of the respective coordinates are the same as those described above. - In step S1406, the
CPU 101 further acquires the information of the first instruction operation and the moving amount information, and also operation information of the second button (information of a second instruction operation) of the mouse 120 or digital pen 130 of the operation unit 109. Also, the CPU 101 updates the cursor position coordinate P(Px, Py) based on the moving amount information. The CPU 101 then determines in step S1407 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S1407), this processing ends. In this case, a so-called “drop” operation is made. - On the other hand, if the first button is kept ON (“YES” in step S1407), the process advances to step S1408. In step S1408, the
CPU 101 sets the X component BLx of the end position BL of the boundary 1205 to match the X component Px of the cursor position P updated in step S1406. In this way, the position of the boundary 1205 follows the cursor movement. - The
CPU 101 determines in step S1409, based on the coordinate Py of the cursor position in the Y direction obtained in step S1406, on which of the first and second regions the cursor P is located, and determines, based on the operation information of the second button, whether or not the second button is ON. - If the cursor P is located on the first region, and the second button is ON, the process advances to step S1410. If the cursor P is located on the first region, and the second button is OFF, the process advances to step S1411. Furthermore, if the cursor P is located on the second region, and the second button is ON, the process advances to step S1412. Moreover, if the cursor P is located on the second region, and the second button is OFF, the process advances to step S1413.
- In step S1410, the
CPU 101 sets the moving amount ΔQLx of the position QL of the arbitrary display contents in the X direction on the left sub-window 1207, as the first side of the boundary 1205, to be equal to the moving amount ΔPx of the cursor P in the X direction. Also, the CPU 101 sets the moving amount ΔQRx of the position QR of the arbitrary display contents in the X direction on the right sub-window 1208, as the second side of the boundary 1205, to be equal to the moving amount ΔPx of the cursor P in the X direction. As a result, the display contents on both sub-windows are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction. - In step S1411, the
CPU 101 sets the moving amount ΔQLx of the position QL of the arbitrary display contents in the X direction on the left sub-window 1207, as the first side of the boundary 1205, to be equal to the moving amount ΔPx of the cursor P in the X direction. Also, the CPU 101 sets the moving amount ΔQRx of the position QR of the arbitrary display contents in the X direction on the right sub-window 1208, as the second side of the boundary 1205, to be zero. In this way, the display contents on the left sub-window 1207 are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction. On the other hand, the display contents on the right sub-window 1208 are not scrolled. - In step S1412, the
CPU 101 sets the moving amount ΔQLx of the position QL of the arbitrary display contents in the X direction on the left sub-window 1207, as the first side of the boundary 1205, to be zero. Also, the CPU 101 sets the moving amount ΔQRx of the position QR of the arbitrary display contents in the X direction on the right sub-window 1208, as the second side of the boundary 1205, to be equal to the moving amount ΔPx of the cursor P in the X direction. In this way, the display contents on the left sub-window 1207 are not scrolled. On the other hand, the display contents on the right sub-window 1208 are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction. - In step S1413, the
CPU 101 sets the moving amount ΔQLx of the position QL of the arbitrary display contents in the X direction on the left sub-window 1207, as the first side of the boundary 1205, to be zero. Also, the CPU 101 sets the moving amount ΔQRx of the position QR of the arbitrary display contents in the X direction on the right sub-window 1208, as the second side of the boundary 1205, to be zero. In this way, the display contents on the sub-windows 1207 and 1208 are not scrolled. - In step S1414, the
CPU 101 updates displays of the cursor and window 1200. The CPU 101 executes this updating process based on the position BLx of the boundary 1205 determined in step S1408, and the moving amounts ΔQLx and ΔQRx determined in one of steps S1410 to S1413. After that, the process returns to step S1406 to continue the processing. - Note that the loop from step S1406 to step S1414 represents cursor movement during dragging; that is, dragging is continued and resizing of the window is in progress during this loop. When the control leaves this loop, the drop operation has been made to settle the window size.
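The loop of steps S1406 to S1414 can be sketched as follows. Note that, under the four-way dispatch, the left (first-side) contents QL scroll exactly when the cursor is on the first region (control modes 1 and 2), and the right (second-side) contents QR scroll exactly when the second button is ON (control modes 1 and 3). The event tuples and function name are illustrative assumptions for this sketch.

```python
# Illustrative sketch of the boundary-drag loop (steps S1406 to S1414): the
# boundary position BLx follows the cursor, while each side's contents scroll
# only in the control modes that call for it.
def drag_boundary(blx, qlx, qrx, events):
    """events: iterable of (first_button_on, second_button_on, px, on_first_region)."""
    last_px = None
    for first_on, second_on, px, on_first in events:
        if not first_on:          # S1407: first button released -> drop, done
            break
        if last_px is not None:
            dpx = px - last_px    # change amount of the boundary in X
            if on_first:          # left sub-window scrolls (modes 1 and 2)
                qlx += dpx
            if second_on:         # right sub-window scrolls (modes 1 and 3)
                qrx += dpx
        blx = px                  # S1408: boundary follows the cursor
        last_px = px              # S1414 would redraw the window here
    return blx, qlx, qrx
```

Because the region and button state are read on every pass, the FIG. 13 transitions (1301 through 1305) can all be produced within one continuous drag.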
- The operation of this embodiment has been described. Note that the display control method according to this embodiment can be applied not only to the window of the configuration shown in FIGS. 12 and 13 but also to a window divided into upper and lower sub-windows. Furthermore, the method of this embodiment can be applied to a window divided into upper, lower, left and right sub-windows, as shown in FIG. 11. - The window shown in
FIG. 11 is normally configured such that the boundary which divides the upper and lower sub-windows and the boundary which divides the left and right sub-windows are independently operable. Hence, by executing the same processing as that shown in FIG. 14 in turn for these boundaries, the display control method of this embodiment can be applied. - In this case, the first and second regions are required to be defined on each boundary. As shown in
FIG. 15A, the length of each boundary may be equally divided. Alternatively, as shown in FIG. 15B, each part divided by the intersection of the vertical and horizontal boundaries may be equally divided. In the case of FIG. 15B, the lengths of the first and second regions change dynamically depending on the position of the intersection. - As described above, according to this embodiment, when a window is divided into sub-windows by a boundary, the change method of the display contents in the sub-windows can be controlled simultaneously with resizing of the sub-windows. In this way, a desired display result can be obtained by a series of operations, thus improving the work efficiency.
- The fifth embodiment of the invention will be described hereinafter. This embodiment proposes display control which is executed in association with the scrolling ON/OFF control method upon resizing a window that is proposed by the present invention.
- Conventionally, display control executed upon resizing includes control for switching ON/OFF of scrolling or a scroll ratio of the display contents according to dragging of a border or corner, control for reducing or enlarging the display contents according to dragging of a border or corner, or the like.
- In general, when the display contents are scrolled upon resizing, the contents on an area opposite to the dragged part are hidden. On the other hand, when the display contents are not scrolled upon resizing, the contents of an area near the dragged part are hidden. (Note that the “area opposite to the dragged part” is an area near a border opposite to the dragged border, or an area near two borders that do not contact the dragged corner. The “area near the dragged part” is an area near the dragged border or an area near two borders that contact the dragged corner.)
- When a part of the window is hidden, the usability may often be impaired. Hence, it is desirable to display such a part, even if imperfectly. In this embodiment, therefore, object images such as characters, patterns, photos, and the like, which are located on an area that would normally be hidden, are displayed while being jammed into the area to be hidden, so as to allow the user to see them.
- For example, display contents shown in
FIG. 16 are assumed. This may be a normal window as described in the first embodiment, or a window as described in the second embodiment, which is always maximized in one direction (the Y direction) within the display screen. In FIG. 16, a left border 1601 is movable by dragging, and a window 1600 can be resized by moving this border 1601. -
FIGS. 17A and 17B show display examples when the user resizes (reduces) the window by dragging the border in this embodiment. FIG. 17A shows a display example upon resizing with scrolling. With this display control, respective objects move to the right upon resizing, and their movement stops when these objects are brought into contact with the opposing border. In this case, the objects are displayed to overlap each other near the opposing border. -
FIG. 17B shows a display example upon resizing without scrolling. With this display control, since scrolling is not performed, all objects are displayed without moving from their positions at the beginning of dragging of the border. However, when the dragged border moves to the right and is brought into contact with respective objects, these objects begin to move to the right. In this case, the objects are displayed to overlap each other near the dragged border. As for the overlapping order, a newly stopped object may be displayed in front of or behind a preexisting object.
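The behavior of FIGS. 17A and 17B can be sketched as follows for a left border dragged to the right. Objects are modeled as [left_x, right_x] intervals, and the function name and parameters are illustrative assumptions; overlapping near the border is permitted, as in the figures.

```python
# Illustrative sketch of FIGS. 17A/17B for a left border dragged right by dbx:
# with scrolling, objects drift with the drag and stop at the opposing border;
# without scrolling, objects stand still until the dragged border reaches them
# and then are pushed along.
def update_objects(objects, bx, dbx, right_border, scrolling):
    """bx: border X before this step; dbx: its movement in this step."""
    for o in objects:
        width = o[1] - o[0]
        if scrolling:                        # FIG. 17A
            o[0] += dbx
            o[1] += dbx
            if o[1] > right_border:          # stop at the opposing border
                o[1] = right_border
                o[0] = right_border - width
        else:                                # FIG. 17B
            if o[0] < bx + dbx:              # dragged border has reached it
                o[0] = bx + dbx              # pushed along, "wiper" style
                o[1] = o[0] + width
    return objects
```

In both branches an object that would leave the visible area is kept at the relevant border, which is what keeps the normally hidden objects partially visible.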
-
FIGS. 18A and 18B show display examples upon resizing a window by dragging one corner of the window. -
FIG. 18A shows a display example upon resizing with scrolling, and FIG. 18B shows that upon resizing without scrolling. The respective operations have the same contents as those described using FIGS. 17A and 17B for the X and Y components. - In order to allow the user to more easily recognize an object group that would normally be hidden, a method shown in
FIGS. 19A and 19B is also available. FIG. 19A shows a display example upon resizing with scrolling. In this case, the following display control is executed. That is, respective objects move to the right upon resizing, and their movement stops when they are brought into contact with the opposing border. In addition, when such an object is brought into contact with another object whose movement has already stopped, the movement of that object stops at that time. As a result, the objects are displayed so as not to overlap each other, unlike in FIG. 17A. -
FIG. 19B shows a case upon resizing without scrolling. The following display control is executed. That is, all objects initially stand still. When the dragged border moves to the right and is brought into contact with respective objects, these objects begin to move to the right. In addition, when objects which have already begun to move are brought into contact with other objects, those other objects begin to move at that time. As a result, the objects are displayed so as not to overlap each other, unlike in FIG. 17B.
- The display control processing according to this embodiment will be described below with reference to the flowchart shown in
FIG. 20. FIG. 20 is a flowchart showing an example of the window resizing processing corresponding to the display examples shown in FIGS. 17A and 17B. The processing corresponding to the flowchart shown in FIG. 20 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control the respective components. - The
CPU 101 determines in step S2001 whether or not the user begins to drag a border. If the user begins to drag the border (“YES” in step S2001), the process advances to step S2002. The CPU 101 determines in step S2002 whether scrolling is ON simultaneously with resizing of the window by dragging. If it is determined that scrolling is OFF (“NO” in step S2002), the process advances to step S2003; otherwise (“YES” in step S2002), the process advances to step S2005. Note that ON/OFF of scrolling can be determined according to the processes described in the first to fourth embodiments. - A case will be examined below wherein the display area of an object O is expressed by O{(O1x, O1y), (O2x, O2y)}. Note that (O1x, O1y) represents the coordinates of the upper left end of the object, and (O2x, O2y) represents the coordinates of the lower right end of the object. Note that the left direction corresponds to the negative direction of the X-axis on an X-Y coordinate
system 502 set in association with the display screen, and the up direction corresponds to the positive direction of the Y-axis. Likewise, the right direction corresponds to the positive direction of the X-axis, and the down direction corresponds to the negative direction of the Y-axis. Let ΔO(ΔO1x, ΔO2x) be the change in the display area O in the X-axis direction. - If it is determined in step S2002 that scrolling is OFF, the display position of the object O is basically not changed. That is, the change amount of the coordinates of the display area is ΔO=(0, 0). On the other hand, if it is determined in step S2002 that scrolling is ON, the display position of the object O is changed according to the drag amount. For example, letting Bx be the coordinate of the dragged border in the X direction and ΔBx be its moving amount, the change amount of the display area of the object in the X direction is ΔOx=(ΔBx, ΔBx). Note that display of such standard objects is not the gist of this embodiment, and is not described in the flowchart of
FIG. 20 . However, in practice, this display control is applied to objects which do not contact the dragged border or opposing border. - The following explanation will continue while focusing on an object which is in contact with the dragged border or its opposing border.
- The
CPU 101 determines in step S2003 whether or not there is an object which is in contact with the dragged border. This determination process can be attained by comparing the coordinates of the display position of the object with those of the dragged border. At this time, when the X-coordinate Bx of the dragged border falls within the range O1x≦Bx≦O2x, it can be considered that the object is in contact with the dragged border. Note that since the flowchart of FIG. 20 assumes the case of FIGS. 17A and 17B, that is, the case of dragging the border in the X direction, only the coordinate in the X-axis direction is considered. In addition, when a border also moves in the Y direction, whether or not an object is in contact with the dragged border can be determined by seeing whether or not the position By of the border in the Y direction falls within the range of that object.
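The contact determination of step S2003 (and, symmetrically, step S2005 with the opposing border's coordinate BOx) can be sketched as follows; the function name and tuple layout are illustrative assumptions for this sketch.

```python
# Illustrative sketch of the border contact test: a vertical border at
# X-coordinate bx touches the object O{(o1x, o1y), (o2x, o2y)} when bx lies
# within [o1x, o2x]; if the border also has a Y position, that position must
# fall within the object's Y range (up is the positive Y-axis, so o2y <= o1y).
def touches_border(obj, bx, by=None):
    """obj: ((o1x, o1y), (o2x, o2y)) -- upper left and lower right corners."""
    (o1x, o1y), (o2x, o2y) = obj
    if not (o1x <= bx <= o2x):
        return False
    return by is None or o2y <= by <= o1y
```

The same predicate serves both steps; only the border coordinate passed in (Bx or BOx) differs.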
- In step S2004, the
CPU 101 changes the display position of the object which is determined to contact the border according to the moving amount ΔBx of the border. That is, the CPU 101 changes the moving amount of the object from ΔOx=(0, 0) before contact to ΔOx=(ΔBx, ΔBx), so as to match the moving amount of the dragged border. As a result, when scrolling is OFF, the object which is in contact with the dragged border is moved and displayed together with the dragged border. After that, the process advances to step S2007. - If scrolling is executed simultaneously with dragging of the border, the
CPU 101 determines in step S2005 whether or not there is an object that is in contact with the border opposite to the dragged border. - In this case as well, letting BOx be the X-coordinate of the opposing border, if BOx falls within the range O1x≦BOx≦O2x, it can be considered that the object is in contact with the opposing border. Note that since the flowchart of
FIG. 20 assumes the case of FIGS. 17A and 17B, that is, the case of dragging the border in the X direction, only the coordinate in the X-axis direction is considered. When a border also moves in the Y direction, whether or not an object is in contact with the opposing border can be determined by checking whether the position BOy of the opposing border in the Y direction falls within the range of that object. - If it is determined that there is an object in contact with the opposing border (“YES” in step S2005), the process advances to step S2006. On the other hand, if it is determined that there is no such object (“NO” in step S2005), the process jumps to step S2007.
- In step S2006, the
CPU 101 fixes the display position of the object determined to be in contact with the opposing border at its current position. That is, the object, which before contact was being scrolled with ΔOx=(ΔBx, ΔBx) in accordance with the change amount of dragging, has its scrolling stopped so that ΔOx=(0, 0). In this way, even when scrolling is executed as a whole, the display position of an object in contact with the opposing border is fixed near that border, so that the object stays within the window display area. After that, the process advances to step S2007. - In step S2007, the
CPU 101 updates the display of the object in contact with the border based on the moving amount of the object determined in step S2004 or S2006. The CPU 101 updates the display of the other objects according to the ON/OFF state of scrolling determined in step S2002. - The
CPU 101 determines in step S2008 whether or not the user has ended dragging. If it is determined that the user has ended dragging (“YES” in step S2008), this processing ends. On the other hand, if it is determined that the user has not ended dragging (“NO” in step S2008), the process returns to step S2002 to continue the processing. - The processing has been described taking as an example the case of
FIGS. 17A and 17B. By extending the aforementioned processing to the Y direction as well, the display control corresponding to FIGS. 18A and 18B can be implemented. As for the display associated with FIGS. 19A and 19B, whether or not objects are in contact with each other needs to be further determined. Then, in the case of “with scrolling”, upon detection of contact with the border or with another object, the change in display position of that object is stopped (i.e., ΔOx=(0, 0)). On the other hand, in the case of “without scrolling”, upon detection of contact with the border or with another object, the change in display position of that object is started (i.e., ΔOx=(ΔBx, ΔBx)). - Furthermore, even when a window is divided into sub-windows by a boundary as in the fourth embodiment, the display control of objects within a display area can be implemented based on the presence or absence of contact with the boundary or border in the same manner as described above.
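The branch logic of steps S2003 through S2007 can be summarized in a minimal sketch. It assumes a one-dimensional (X-direction) drag as in FIGS. 17A and 17B; the function and field names are hypothetical, not taken from the patent:

```python
def handle_border_drag(objects, dragged_x, opposing_x, scrolling, delta_bx):
    """One update pass of steps S2003-S2007 for an X-direction drag.

    objects: list of dicts with 'x1'/'x2' (horizontal extent) and 'dx'
    (per-update moving amount). Returns the updated list.
    """
    for obj in objects:
        if not scrolling:
            # S2003/S2004: scrolling OFF -- an object touched by the dragged
            # border is carried along with the border; other objects stay put.
            obj["dx"] = delta_bx if obj["x1"] <= dragged_x <= obj["x2"] else 0.0
        else:
            # S2005/S2006: scrolling ON -- an object reaching the opposing
            # border stops scrolling so it stays within the display area;
            # everything else scrolls by the drag amount.
            obj["dx"] = 0.0 if obj["x1"] <= opposing_x <= obj["x2"] else delta_bx
    # S2007: apply the moving amounts to the display positions.
    for obj in objects:
        obj["x1"] += obj["dx"]
        obj["x2"] += obj["dx"]
    return objects
```

Extending the same logic to the Y direction means repeating the range test and position update with the objects' vertical extents and the border's Y moving amount.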
- As described above, even when the display contents are scrolled simultaneously with dragging, if an object in the display contents is in contact with an element (border or boundary) of the window, scrolling of that object can be suppressed. Conversely, even when the display contents are not scrolled simultaneously with dragging, if an object in the display contents is in contact with an element (border or boundary) of the window, that object can be scrolled.
- On the other hand, even when the display contents are scrolled simultaneously with dragging, if an object in the display contents is in contact with another object whose scrolling has already been suppressed, scrolling of that object can also be suppressed. Even when the display contents are not scrolled simultaneously with dragging, if an object in the display contents is in contact with another object which has already been scrolled, that object can also be scrolled.
- In this way, display control can be implemented as if objects attached to a window were being scooped up by a wiper, and objects which would normally be hidden are displayed, even if only partially, thus further improving usability.
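The object-to-object case of FIGS. 19A and 19B amounts to propagating a moving amount through chains of touching objects. A sketch of that propagation, with hypothetical names and dict fields:

```python
def overlaps(a, b):
    """True when the horizontal extents of two objects touch or overlap."""
    return a["x1"] <= b["x2"] and b["x1"] <= a["x2"]

def propagate_contact(objects, scrolling, delta_bx):
    """Spread moving amounts through chains of touching objects.

    With scrolling ON, an object touching an already-stopped object stops
    too; with scrolling OFF, an object touching an already-moving object
    starts moving too -- the 'wiper' effect of FIGS. 19A and 19B.
    objects: list of dicts with 'x1'/'x2' (extent) and 'dx' (moving amount).
    """
    changed = True
    while changed:          # repeat until no contact changes any moving amount
        changed = False
        for obj in objects:
            for other in objects:
                if obj is other or not overlaps(obj, other):
                    continue
                if scrolling and other["dx"] == 0.0 and obj["dx"] != 0.0:
                    obj["dx"] = 0.0       # stop on contact with a stopped object
                    changed = True
                elif not scrolling and other["dx"] == delta_bx and obj["dx"] != delta_bx:
                    obj["dx"] = delta_bx  # start on contact with a moving object
                    changed = True
    return objects
```

The fixed-point loop makes the effect transitive: an object stopped at the opposing border stops the object touching it, which in turn stops the next one, and so on.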
- The above-described exemplary embodiments of the present invention can also be achieved by providing, to a system or an apparatus, a computer-readable storage medium that stores program code of software (a computer program) which realizes the operations of the above-described exemplary embodiments. Further, the above-described exemplary embodiments can be achieved by having a computer (a CPU or micro-processing unit (MPU)) of the system or apparatus read and execute the program code (computer program) stored in the storage medium.
- The computer program realizes each step included in the flowcharts of the above-described exemplary embodiments. Namely, the computer program causes a computer to function as the processing unit corresponding to each step of the flowcharts. In this case, the computer program itself, read from a computer-readable storage medium, realizes the operations of the above-described exemplary embodiments, and the storage medium storing the computer program constitutes the present invention.
- Further, the storage medium which provides the computer program can be, for example, a floppy disk, a hard disk, a magnetic storage medium such as a magnetic tape, an optical/magneto-optical storage medium such as a magneto-optical disk (MO), a compact disc (CD), a digital versatile disc (DVD), a CD read-only memory (CD-ROM), a CD recordable (CD-R), a nonvolatile semiconductor memory, a ROM and so on.
- Further, an OS or the like working on a computer can also perform a part or the whole of processes according to instructions of the computer program and realize functions of the above-described exemplary embodiments.
- In the above-described exemplary embodiments, the CPU executes each step in the flowcharts in cooperation with a memory, a hard disk, a display device, and so on. However, the present invention is not limited to the above configuration, and a dedicated electronic circuit can perform a part or the whole of the processes in each step described in each flowchart in place of the CPU.
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
- This application claims the benefit of Japanese Patent Application No. 2007-186326, filed Jul. 17, 2007, which is hereby incorporated by reference herein in its entirety.
Claims (16)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/345,230 US9389746B2 (en) | 2007-07-17 | 2012-01-06 | Information processing apparatus and control method thereof, and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007-186326 | 2007-07-17 | ||
JP2007186326A JP5184832B2 (en) | 2007-07-17 | 2007-07-17 | Information processing apparatus, control method therefor, and computer program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/345,230 Continuation US9389746B2 (en) | 2007-07-17 | 2012-01-06 | Information processing apparatus and control method thereof, and computer program |
Publications (2)
Publication Number | Publication Date |
---|---|
US20090024956A1 true US20090024956A1 (en) | 2009-01-22 |
US8112716B2 US8112716B2 (en) | 2012-02-07 |
Family
ID=40265879
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/170,994 Expired - Fee Related US8112716B2 (en) | 2007-07-17 | 2008-07-10 | Information processing apparatus and control method thereof, and computer program |
US13/345,230 Expired - Fee Related US9389746B2 (en) | 2007-07-17 | 2012-01-06 | Information processing apparatus and control method thereof, and computer program |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/345,230 Expired - Fee Related US9389746B2 (en) | 2007-07-17 | 2012-01-06 | Information processing apparatus and control method thereof, and computer program |
Country Status (2)
Country | Link |
---|---|
US (2) | US8112716B2 (en) |
JP (1) | JP5184832B2 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5815151A (en) * | 1996-03-08 | 1998-09-29 | International Business Machines Corp. | Graphical user interface |
US7051289B1 (en) * | 1997-03-21 | 2006-05-23 | International Business Machines Corporation | Window display device and method, and a recording medium recording a window display control program |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03214362A (en) * | 1990-01-19 | 1991-09-19 | Fuji Xerox Co Ltd | Computer system |
JP3350570B2 (en) * | 1993-05-10 | 2002-11-25 | 富士通株式会社 | List display method |
JPH0793123A (en) * | 1993-09-20 | 1995-04-07 | Fujitsu Ltd | Display device |
JP3445341B2 (en) * | 1993-12-24 | 2003-09-08 | 株式会社東芝 | Window display device and window display method |
JP2765615B2 (en) * | 1994-08-09 | 1998-06-18 | カシオ計算機株式会社 | Window display control device |
JP3404931B2 (en) * | 1994-11-15 | 2003-05-12 | カシオ計算機株式会社 | Table processing equipment |
JP3760492B2 (en) | 1995-12-28 | 2006-03-29 | 富士ゼロックス株式会社 | Multi-window display device and multi-window display method |
CA2175148C (en) * | 1996-04-26 | 2002-06-11 | Robert Cecco | User interface control for creating split panes in a single window |
JP4281120B2 (en) * | 1998-01-16 | 2009-06-17 | ソニー株式会社 | Editing apparatus and method, and recording medium |
- 2007-07-17: JP JP2007186326A patent/JP5184832B2/en not_active Expired - Fee Related
- 2008-07-10: US US12/170,994 patent/US8112716B2/en not_active Expired - Fee Related
- 2012-01-06: US US13/345,230 patent/US9389746B2/en not_active Expired - Fee Related
US9882907B1 (en) | 2012-11-08 | 2018-01-30 | Snap Inc. | Apparatus and method for single action control of social network profile access |
US10887308B1 (en) * | 2012-11-08 | 2021-01-05 | Snap Inc. | Interactive user-interface to adjust access privileges |
US10439972B1 (en) | 2013-05-30 | 2019-10-08 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US11134046B2 (en) | 2013-05-30 | 2021-09-28 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US11115361B2 (en) | 2013-05-30 | 2021-09-07 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US10587552B1 (en) | 2013-05-30 | 2020-03-10 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9705831B2 (en) | 2013-05-30 | 2017-07-11 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9742713B2 (en) | 2013-05-30 | 2017-08-22 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US11509618B2 (en) | 2013-05-30 | 2022-11-22 | Snap Inc. | Maintaining a message thread with opt-in permanence for entries |
US20150095845A1 (en) * | 2013-09-30 | 2015-04-02 | Samsung Electronics Co., Ltd. | Electronic device and method for providing user interface in electronic device |
US11546388B2 (en) | 2013-11-26 | 2023-01-03 | Snap Inc. | Method and system for integrating real time communication features in applications |
US9794303B1 (en) | 2013-11-26 | 2017-10-17 | Snap Inc. | Method and system for integrating real time communication features in applications |
US9083770B1 (en) | 2013-11-26 | 2015-07-14 | Snapchat, Inc. | Method and system for integrating real time communication features in applications |
US10069876B1 (en) | 2013-11-26 | 2018-09-04 | Snap Inc. | Method and system for integrating real time communication features in applications |
US10681092B1 (en) | 2013-11-26 | 2020-06-09 | Snap Inc. | Method and system for integrating real time communication features in applications |
US11102253B2 (en) | 2013-11-26 | 2021-08-24 | Snap Inc. | Method and system for integrating real time communication features in applications |
US9936030B2 (en) | 2014-01-03 | 2018-04-03 | Investel Capital Corporation | User content sharing system and method with location-based external content integration |
US10080102B1 (en) | 2014-01-12 | 2018-09-18 | Investment Asset Holdings Llc | Location-based messaging |
US9866999B1 (en) | 2014-01-12 | 2018-01-09 | Investment Asset Holdings Llc | Location-based messaging |
US10349209B1 (en) | 2014-01-12 | 2019-07-09 | Investment Asset Holdings Llc | Location-based messaging |
US10082926B1 (en) | 2014-02-21 | 2018-09-25 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10958605B1 (en) | 2014-02-21 | 2021-03-23 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10949049B1 (en) | 2014-02-21 | 2021-03-16 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10084735B1 (en) | 2014-02-21 | 2018-09-25 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US11902235B2 (en) | 2014-02-21 | 2024-02-13 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US11463393B2 (en) | 2014-02-21 | 2022-10-04 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US11463394B2 (en) | 2014-02-21 | 2022-10-04 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US9407712B1 (en) | 2014-03-07 | 2016-08-02 | Snapchat, Inc. | Content delivery network for ephemeral objects |
US9237202B1 (en) | 2014-03-07 | 2016-01-12 | Snapchat, Inc. | Content delivery network for ephemeral objects |
US10817156B1 (en) | 2014-05-09 | 2020-10-27 | Snap Inc. | Dynamic configuration of application component tiles |
US11310183B2 (en) | 2014-05-09 | 2022-04-19 | Snap Inc. | Dynamic configuration of application component tiles |
US9276886B1 (en) | 2014-05-09 | 2016-03-01 | Snapchat, Inc. | Apparatus and method for dynamically configuring application component tiles |
US11743219B2 (en) | 2014-05-09 | 2023-08-29 | Snap Inc. | Dynamic configuration of application component tiles |
US9396354B1 (en) | 2014-05-28 | 2016-07-19 | Snapchat, Inc. | Apparatus and method for automated privacy protection in distributed images |
US9785796B1 (en) | 2014-05-28 | 2017-10-10 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US10572681B1 (en) | 2014-05-28 | 2020-02-25 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US10990697B2 (en) | 2014-05-28 | 2021-04-27 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US11921805B2 (en) | 2014-06-05 | 2024-03-05 | Snap Inc. | Web document enhancement |
US11625443B2 (en) | 2014-06-05 | 2023-04-11 | Snap Inc. | Web document enhancement |
US11317240B2 (en) | 2014-06-13 | 2022-04-26 | Snap Inc. | Geo-location based event gallery |
US10182311B2 (en) | 2014-06-13 | 2019-01-15 | Snap Inc. | Prioritization of messages within a message collection |
US9094137B1 (en) | 2014-06-13 | 2015-07-28 | Snapchat, Inc. | Priority based placement of messages in a geo-location based event gallery |
US9113301B1 (en) | 2014-06-13 | 2015-08-18 | Snapchat, Inc. | Geo-location based event gallery |
US10659914B1 (en) | 2014-06-13 | 2020-05-19 | Snap Inc. | Geo-location based event gallery |
US9825898B2 (en) | 2014-06-13 | 2017-11-21 | Snap Inc. | Prioritization of messages within a message collection |
US10623891B2 (en) | 2014-06-13 | 2020-04-14 | Snap Inc. | Prioritization of messages within a message collection |
US9532171B2 (en) | 2014-06-13 | 2016-12-27 | Snap Inc. | Geo-location based event gallery |
US9430783B1 (en) | 2014-06-13 | 2016-08-30 | Snapchat, Inc. | Prioritization of messages within gallery |
US9693191B2 (en) | 2014-06-13 | 2017-06-27 | Snap Inc. | Prioritization of messages within gallery |
US10779113B2 (en) | 2014-06-13 | 2020-09-15 | Snap Inc. | Prioritization of messages within a message collection |
US10448201B1 (en) | 2014-06-13 | 2019-10-15 | Snap Inc. | Prioritization of messages within a message collection |
US10524087B1 (en) | 2014-06-13 | 2019-12-31 | Snap Inc. | Message destination list mechanism |
US11166121B2 (en) | 2014-06-13 | 2021-11-02 | Snap Inc. | Prioritization of messages within a message collection |
US10200813B1 (en) | 2014-06-13 | 2019-02-05 | Snap Inc. | Geo-location based event gallery |
US9225897B1 (en) | 2014-07-07 | 2015-12-29 | Snapchat, Inc. | Apparatus and method for supplying content aware photo filters |
US10348960B1 (en) | 2014-07-07 | 2019-07-09 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11122200B2 (en) | 2014-07-07 | 2021-09-14 | Snap Inc. | Supplying content aware photo filters |
US11849214B2 (en) | 2014-07-07 | 2023-12-19 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11595569B2 (en) | 2014-07-07 | 2023-02-28 | Snap Inc. | Supplying content aware photo filters |
US10154192B1 (en) | 2014-07-07 | 2018-12-11 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US9407816B1 (en) | 2014-07-07 | 2016-08-02 | Snapchat, Inc. | Apparatus and method for supplying content aware photo filters |
US10701262B1 (en) | 2014-07-07 | 2020-06-30 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10432850B1 (en) | 2014-07-07 | 2019-10-01 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10602057B1 (en) | 2014-07-07 | 2020-03-24 | Snap Inc. | Supplying content aware photo filters |
US11496673B1 (en) | 2014-07-07 | 2022-11-08 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10055717B1 (en) | 2014-08-22 | 2018-08-21 | Snap Inc. | Message processor with application prompts |
US11017363B1 (en) | 2014-08-22 | 2021-05-25 | Snap Inc. | Message processor with application prompts |
US11625755B1 (en) | 2014-09-16 | 2023-04-11 | Foursquare Labs, Inc. | Determining targeting information based on a predictive targeting model |
US10423983B2 (en) | 2014-09-16 | 2019-09-24 | Snap Inc. | Determining targeting information based on a predictive targeting model |
US10824654B2 (en) | 2014-09-18 | 2020-11-03 | Snap Inc. | Geolocation-based pictographs |
US11741136B2 (en) | 2014-09-18 | 2023-08-29 | Snap Inc. | Geolocation-based pictographs |
US11281701B2 (en) | 2014-09-18 | 2022-03-22 | Snap Inc. | Geolocation-based pictographs |
US11216869B2 (en) | 2014-09-23 | 2022-01-04 | Snap Inc. | User interface to augment an image using geolocation |
US10944710B1 (en) | 2014-10-02 | 2021-03-09 | Snap Inc. | Ephemeral gallery user interface with remaining gallery time indication |
US11855947B1 (en) | 2014-10-02 | 2023-12-26 | Snap Inc. | Gallery of ephemeral messages |
US20170374003A1 (en) | 2014-10-02 | 2017-12-28 | Snapchat, Inc. | Ephemeral gallery of ephemeral messages |
US11038829B1 (en) | 2014-10-02 | 2021-06-15 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US11522822B1 (en) | 2014-10-02 | 2022-12-06 | Snap Inc. | Ephemeral gallery elimination based on gallery and message timers |
US9537811B2 (en) | 2014-10-02 | 2017-01-03 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US10958608B1 (en) | 2014-10-02 | 2021-03-23 | Snap Inc. | Ephemeral gallery of visual media messages |
US10708210B1 (en) | 2014-10-02 | 2020-07-07 | Snap Inc. | Multi-user ephemeral message gallery |
US11411908B1 (en) | 2014-10-02 | 2022-08-09 | Snap Inc. | Ephemeral message gallery user interface with online viewing history indicia |
US10284508B1 (en) | 2014-10-02 | 2019-05-07 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US11012398B1 (en) | 2014-10-02 | 2021-05-18 | Snap Inc. | Ephemeral message gallery user interface with screenshot messages |
US10476830B2 (en) | 2014-10-02 | 2019-11-12 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US9843720B1 (en) | 2014-11-12 | 2017-12-12 | Snap Inc. | User interface for accessing media at a geographic location |
US10616476B1 (en) | 2014-11-12 | 2020-04-07 | Snap Inc. | User interface for accessing media at a geographic location |
US11956533B2 (en) | 2014-11-12 | 2024-04-09 | Snap Inc. | Accessing media at a geographic location |
US11190679B2 (en) | 2014-11-12 | 2021-11-30 | Snap Inc. | Accessing media at a geographic location |
US10811053B2 (en) | 2014-12-19 | 2020-10-20 | Snap Inc. | Routing messages by message parameter |
US9385983B1 (en) | 2014-12-19 | 2016-07-05 | Snapchat, Inc. | Gallery of messages from individuals with a shared interest |
US11250887B2 (en) | 2014-12-19 | 2022-02-15 | Snap Inc. | Routing messages by message parameter |
US11803345B2 (en) | 2014-12-19 | 2023-10-31 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US10311916B2 (en) | 2014-12-19 | 2019-06-04 | Snap Inc. | Gallery of videos set to an audio time line |
US9854219B2 (en) | 2014-12-19 | 2017-12-26 | Snap Inc. | Gallery of videos set to an audio time line |
US11783862B2 (en) | 2014-12-19 | 2023-10-10 | Snap Inc. | Routing messages by message parameter |
US10514876B2 (en) | 2014-12-19 | 2019-12-24 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11372608B2 (en) | 2014-12-19 | 2022-06-28 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US10580458B2 (en) | 2014-12-19 | 2020-03-03 | Snap Inc. | Gallery of videos set to an audio time line |
US20160188148A1 (en) * | 2014-12-24 | 2016-06-30 | Samsung Electronics Co., Ltd. | Display apparatus and display method |
US10671265B2 (en) * | 2014-12-24 | 2020-06-02 | Samsung Electronics Co., Ltd. | Display apparatus and display method |
US11734342B2 (en) | 2015-01-09 | 2023-08-22 | Snap Inc. | Object recognition based image overlays |
US10380720B1 (en) | 2015-01-09 | 2019-08-13 | Snap Inc. | Location-based image filters |
US11301960B2 (en) | 2015-01-09 | 2022-04-12 | Snap Inc. | Object recognition based image filters |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US11388226B1 (en) | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
US10133705B1 (en) | 2015-01-19 | 2018-11-20 | Snap Inc. | Multichannel system |
US10416845B1 (en) | 2015-01-19 | 2019-09-17 | Snap Inc. | Multichannel system |
US11249617B1 (en) | 2015-01-19 | 2022-02-15 | Snap Inc. | Multichannel system |
US10123166B2 (en) | 2015-01-26 | 2018-11-06 | Snap Inc. | Content request by location |
US10536800B1 (en) | 2015-01-26 | 2020-01-14 | Snap Inc. | Content request by location |
US11528579B2 (en) | 2015-01-26 | 2022-12-13 | Snap Inc. | Content request by location |
US10932085B1 (en) | 2015-01-26 | 2021-02-23 | Snap Inc. | Content request by location |
US11910267B2 (en) | 2015-01-26 | 2024-02-20 | Snap Inc. | Content request by location |
US10223397B1 (en) | 2015-03-13 | 2019-03-05 | Snap Inc. | Social graph based co-location of network users |
US10616239B2 (en) | 2015-03-18 | 2020-04-07 | Snap Inc. | Geo-fence authorization provisioning |
US11902287B2 (en) | 2015-03-18 | 2024-02-13 | Snap Inc. | Geo-fence authorization provisioning |
US10893055B2 (en) | 2015-03-18 | 2021-01-12 | Snap Inc. | Geo-fence authorization provisioning |
US11320651B2 (en) | 2015-03-23 | 2022-05-03 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US11662576B2 (en) | 2015-03-23 | 2023-05-30 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US10948717B1 (en) | 2015-03-23 | 2021-03-16 | Snap Inc. | Reducing boot time and power consumption in wearable display systems |
US10135949B1 (en) | 2015-05-05 | 2018-11-20 | Snap Inc. | Systems and methods for story and sub-story navigation |
US11449539B2 (en) | 2015-05-05 | 2022-09-20 | Snap Inc. | Automated local story generation and curation |
US10911575B1 (en) | 2015-05-05 | 2021-02-02 | Snap Inc. | Systems and methods for story and sub-story navigation |
US10592574B2 (en) | 2015-05-05 | 2020-03-17 | Snap Inc. | Systems and methods for automated local story generation and curation |
US11496544B2 (en) | 2015-05-05 | 2022-11-08 | Snap Inc. | Story and sub-story navigation |
US10993069B2 (en) | 2015-07-16 | 2021-04-27 | Snap Inc. | Dynamically adaptive media content delivery |
US10817898B2 (en) | 2015-08-13 | 2020-10-27 | Placed, Llc | Determining exposures to content presented by physical objects |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US11769307B2 (en) | 2015-10-30 | 2023-09-26 | Snap Inc. | Image based tracking in augmented reality systems |
US10733802B2 (en) | 2015-10-30 | 2020-08-04 | Snap Inc. | Image based tracking in augmented reality systems |
US11315331B2 (en) | 2015-10-30 | 2022-04-26 | Snap Inc. | Image based tracking in augmented reality systems |
US11599241B2 (en) | 2015-11-30 | 2023-03-07 | Snap Inc. | Network resource location linking and visual content sharing |
US11380051B2 (en) | 2015-11-30 | 2022-07-05 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10474321B2 (en) | 2015-11-30 | 2019-11-12 | Snap Inc. | Network resource location linking and visual content sharing |
US10354425B2 (en) | 2015-12-18 | 2019-07-16 | Snap Inc. | Method and system for providing context relevant media augmentation |
US11830117B2 (en) | 2015-12-18 | 2023-11-28 | Snap Inc. | Media overlay publication system |
US11468615B2 (en) | 2015-12-18 | 2022-10-11 | Snap Inc. | Media overlay publication system |
US10997758B1 (en) | 2015-12-18 | 2021-05-04 | Snap Inc. | Media overlay publication system |
US10834525B2 (en) | 2016-02-26 | 2020-11-10 | Snap Inc. | Generation, curation, and presentation of media collections |
US11611846B2 (en) | 2016-02-26 | 2023-03-21 | Snap Inc. | Generation, curation, and presentation of media collections |
US11023514B2 (en) | 2016-02-26 | 2021-06-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11197123B2 (en) | 2016-02-26 | 2021-12-07 | Snap Inc. | Generation, curation, and presentation of media collections |
US10679389B2 (en) | 2016-02-26 | 2020-06-09 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11889381B2 (en) | 2016-02-26 | 2024-01-30 | Snap Inc. | Generation, curation, and presentation of media collections |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11900418B2 (en) | 2016-04-04 | 2024-02-13 | Snap Inc. | Mutable geo-fencing system |
US10839219B1 (en) | 2016-06-20 | 2020-11-17 | Pipbin, Inc. | System for curation, distribution and display of location-dependent augmented reality content |
US10805696B1 (en) | 2016-06-20 | 2020-10-13 | Pipbin, Inc. | System for recording and targeting tagged content of user interest |
US11201981B1 (en) | 2016-06-20 | 2021-12-14 | Pipbin, Inc. | System for notification of user accessibility of curated location-dependent content in an augmented estate |
US11785161B1 (en) | 2016-06-20 | 2023-10-10 | Pipbin, Inc. | System for user accessibility of tagged curated augmented reality content |
US11044393B1 (en) | 2016-06-20 | 2021-06-22 | Pipbin, Inc. | System for curation and display of location-dependent augmented reality content in an augmented estate system |
US10992836B2 (en) | 2016-06-20 | 2021-04-27 | Pipbin, Inc. | Augmented property system of curated augmented reality media elements |
US10638256B1 (en) | 2016-06-20 | 2020-04-28 | Pipbin, Inc. | System for distribution and display of mobile targeted augmented reality content |
US11876941B1 (en) | 2016-06-20 | 2024-01-16 | Pipbin, Inc. | Clickable augmented reality content manager, system, and network |
US10735892B2 (en) | 2016-06-28 | 2020-08-04 | Snap Inc. | System to track engagement of media items |
US10327100B1 (en) | 2016-06-28 | 2019-06-18 | Snap Inc. | System to track engagement of media items |
US10219110B2 (en) | 2016-06-28 | 2019-02-26 | Snap Inc. | System to track engagement of media items |
US10430838B1 (en) | 2016-06-28 | 2019-10-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections with automated advertising |
US10165402B1 (en) | 2016-06-28 | 2018-12-25 | Snap Inc. | System to track engagement of media items |
US10885559B1 (en) | 2016-06-28 | 2021-01-05 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10785597B2 (en) | 2016-06-28 | 2020-09-22 | Snap Inc. | System to track engagement of media items |
US11640625B2 (en) | 2016-06-28 | 2023-05-02 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US11445326B2 (en) | 2016-06-28 | 2022-09-13 | Snap Inc. | Track engagement of media items |
US10506371B2 (en) | 2016-06-28 | 2019-12-10 | Snap Inc. | System to track engagement of media items |
US10387514B1 (en) | 2016-06-30 | 2019-08-20 | Snap Inc. | Automated content curation and communication |
US11080351B1 (en) | 2016-06-30 | 2021-08-03 | Snap Inc. | Automated content curation and communication |
US11895068B2 (en) | 2016-06-30 | 2024-02-06 | Snap Inc. | Automated content curation and communication |
US11509615B2 (en) | 2016-07-19 | 2022-11-22 | Snap Inc. | Generating customized electronic messaging graphics |
US10348662B2 (en) | 2016-07-19 | 2019-07-09 | Snap Inc. | Generating customized electronic messaging graphics |
US11816853B2 (en) | 2016-08-30 | 2023-11-14 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US11876762B1 (en) | 2016-10-24 | 2024-01-16 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US10623666B2 (en) | 2016-11-07 | 2020-04-14 | Snap Inc. | Selective identification and order of image modifiers |
US11233952B2 (en) | 2016-11-07 | 2022-01-25 | Snap Inc. | Selective identification and order of image modifiers |
US11750767B2 (en) | 2016-11-07 | 2023-09-05 | Snap Inc. | Selective identification and order of image modifiers |
US10860191B2 (en) * | 2016-12-02 | 2020-12-08 | Samsung Electronics Co., Ltd. | Method for adjusting screen size and electronic device therefor |
US11397517B2 (en) | 2016-12-09 | 2022-07-26 | Snap Inc. | Customized media overlays |
US10203855B2 (en) | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
US10754525B1 (en) | 2016-12-09 | 2020-08-25 | Snap Inc. | Customized media overlays |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US10915911B2 (en) | 2017-02-03 | 2021-02-09 | Snap Inc. | System to determine a price-schedule to distribute media content |
US11250075B1 (en) | 2017-02-17 | 2022-02-15 | Snap Inc. | Searching social media content |
US10319149B1 (en) | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US11861795B1 (en) | 2017-02-17 | 2024-01-02 | Snap Inc. | Augmented reality anamorphosis system |
US11720640B2 (en) | 2017-02-17 | 2023-08-08 | Snap Inc. | Searching social media content |
US11748579B2 (en) | 2017-02-20 | 2023-09-05 | Snap Inc. | Augmented reality speech balloon system |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
US11670057B2 (en) | 2017-03-06 | 2023-06-06 | Snap Inc. | Virtual vision system |
US10523625B1 (en) | 2017-03-09 | 2019-12-31 | Snap Inc. | Restricted group content collection |
US11258749B2 (en) | 2017-03-09 | 2022-02-22 | Snap Inc. | Restricted group content collection |
US10887269B1 (en) | 2017-03-09 | 2021-01-05 | Snap Inc. | Restricted group content collection |
US11558678B2 (en) | 2017-03-27 | 2023-01-17 | Snap Inc. | Generating a stitched data stream |
US11349796B2 (en) | 2017-03-27 | 2022-05-31 | Snap Inc. | Generating a stitched data stream |
US10582277B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US10581782B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US11297399B1 (en) | 2017-03-27 | 2022-04-05 | Snap Inc. | Generating a stitched data stream |
US11170393B1 (en) | 2017-04-11 | 2021-11-09 | Snap Inc. | System to calculate an engagement score of location based media content |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US11195018B1 (en) | 2017-04-20 | 2021-12-07 | Snap Inc. | Augmented reality typography personalization system |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US11385763B2 (en) | 2017-04-27 | 2022-07-12 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US11451956B1 (en) | 2017-04-27 | 2022-09-20 | Snap Inc. | Location privacy management on map-based social media platforms |
US11556221B2 (en) | 2017-04-27 | 2023-01-17 | Snap Inc. | Friend location sharing mechanism for social media platforms |
US11782574B2 (en) | 2017-04-27 | 2023-10-10 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11418906B2 (en) | 2017-04-27 | 2022-08-16 | Snap Inc. | Selective location-based identity communication |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11474663B2 (en) | 2017-04-27 | 2022-10-18 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11392264B1 (en) | 2017-04-27 | 2022-07-19 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US11409407B2 (en) | 2017-04-27 | 2022-08-09 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11232040B1 (en) | 2017-04-28 | 2022-01-25 | Snap Inc. | Precaching unlockable data elements |
US11675831B2 (en) | 2017-05-31 | 2023-06-13 | Snap Inc. | Geolocation based playlists |
US11475254B1 (en) | 2017-09-08 | 2022-10-18 | Snap Inc. | Multimodal entity identification |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US11335067B2 (en) | 2017-09-15 | 2022-05-17 | Snap Inc. | Augmented reality system |
US11721080B2 (en) | 2017-09-15 | 2023-08-08 | Snap Inc. | Augmented reality system |
US10499191B1 (en) | 2017-10-09 | 2019-12-03 | Snap Inc. | Context sensitive presentation of content |
US11617056B2 (en) | 2017-10-09 | 2023-03-28 | Snap Inc. | Context sensitive presentation of content |
US11006242B1 (en) | 2017-10-09 | 2021-05-11 | Snap Inc. | Context sensitive presentation of content |
US11670025B2 (en) | 2017-10-30 | 2023-06-06 | Snap Inc. | Mobile-based cartographic control of display content |
US11030787B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Mobile-based cartographic control of display content |
US11558327B2 (en) | 2017-12-01 | 2023-01-17 | Snap Inc. | Dynamic media overlay with smart widget |
US11265273B1 (en) | 2017-12-01 | 2022-03-01 | Snap Inc. | Dynamic media overlay with smart widget |
US11943185B2 (en) | 2017-12-01 | 2024-03-26 | Snap Inc. | Dynamic media overlay with smart widget |
US11687720B2 (en) | 2017-12-22 | 2023-06-27 | Snap Inc. | Named entity recognition visual context and caption data |
US11017173B1 (en) | 2017-12-22 | 2021-05-25 | Snap Inc. | Named entity recognition visual context and caption data |
US10678818B2 (en) | 2018-01-03 | 2020-06-09 | Snap Inc. | Tag distribution visualization system |
US11487794B2 (en) | 2018-01-03 | 2022-11-01 | Snap Inc. | Tag distribution visualization system |
US11841896B2 (en) | 2018-02-13 | 2023-12-12 | Snap Inc. | Icon based tagging |
US11507614B1 (en) | 2018-02-13 | 2022-11-22 | Snap Inc. | Icon based tagging |
US10885136B1 (en) | 2018-02-28 | 2021-01-05 | Snap Inc. | Audience filtering system |
US11523159B2 (en) | 2018-02-28 | 2022-12-06 | Snap Inc. | Generating media content items based on location information |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US11722837B2 (en) | 2018-03-06 | 2023-08-08 | Snap Inc. | Geo-fence selection system |
US10524088B2 (en) | 2018-03-06 | 2019-12-31 | Snap Inc. | Geo-fence selection system |
US10327096B1 (en) | 2018-03-06 | 2019-06-18 | Snap Inc. | Geo-fence selection system |
US11044574B2 (en) | 2018-03-06 | 2021-06-22 | Snap Inc. | Geo-fence selection system |
US11570572B2 (en) | 2018-03-06 | 2023-01-31 | Snap Inc. | Geo-fence selection system |
US11491393B2 (en) | 2018-03-14 | 2022-11-08 | Snap Inc. | Generating collectible items based on location information |
US10933311B2 (en) | 2018-03-14 | 2021-03-02 | Snap Inc. | Generating collectible items based on location information |
US11163941B1 (en) | 2018-03-30 | 2021-11-02 | Snap Inc. | Annotating a collection of media content items |
US11297463B2 (en) | 2018-04-18 | 2022-04-05 | Snap Inc. | Visitation tracking system |
US10681491B1 (en) | 2018-04-18 | 2020-06-09 | Snap Inc. | Visitation tracking system |
US10219111B1 (en) | 2018-04-18 | 2019-02-26 | Snap Inc. | Visitation tracking system |
US10448199B1 (en) | 2018-04-18 | 2019-10-15 | Snap Inc. | Visitation tracking system |
US10779114B2 (en) | 2018-04-18 | 2020-09-15 | Snap Inc. | Visitation tracking system |
US11683657B2 (en) | 2018-04-18 | 2023-06-20 | Snap Inc. | Visitation tracking system |
US10924886B2 (en) | 2018-04-18 | 2021-02-16 | Snap Inc. | Visitation tracking system |
US11860888B2 (en) | 2018-05-22 | 2024-01-02 | Snap Inc. | Event detection system |
US11367234B2 (en) | 2018-07-24 | 2022-06-21 | Snap Inc. | Conditional modification of augmented reality object |
US11670026B2 (en) | 2018-07-24 | 2023-06-06 | Snap Inc. | Conditional modification of augmented reality object |
US10789749B2 (en) | 2018-07-24 | 2020-09-29 | Snap Inc. | Conditional modification of augmented reality object |
US10679393B2 (en) | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
US10943381B2 (en) | 2018-07-24 | 2021-03-09 | Snap Inc. | Conditional modification of augmented reality object |
US11676319B2 (en) | 2018-08-31 | 2023-06-13 | Snap Inc. | Augmented reality anthropomorphization system |
US11450050B2 (en) | 2018-08-31 | 2022-09-20 | Snap Inc. | Augmented reality anthropomorphization system |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US11704005B2 (en) | 2018-09-28 | 2023-07-18 | Snap Inc. | Collaborative achievement interface |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11799811B2 (en) | 2018-10-31 | 2023-10-24 | Snap Inc. | Messaging and gaming applications communication platform |
US11558709B2 (en) | 2018-11-30 | 2023-01-17 | Snap Inc. | Position service to determine relative position to map features |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US11812335B2 (en) | 2018-11-30 | 2023-11-07 | Snap Inc. | Position service to determine relative position to map features |
US11698722B2 (en) | 2018-11-30 | 2023-07-11 | Snap Inc. | Generating customized avatars based on location information |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11693887B2 (en) | 2019-01-30 | 2023-07-04 | Snap Inc. | Adaptive spatial density based clustering |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11954314B2 (en) | 2019-02-25 | 2024-04-09 | Snap Inc. | Custom media overlay system |
US11500525B2 (en) | 2019-02-25 | 2022-11-15 | Snap Inc. | Custom media overlay system |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11249614B2 (en) | 2019-03-28 | 2022-02-15 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11740760B2 (en) | 2019-03-28 | 2023-08-29 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11361493B2 (en) | 2019-04-01 | 2022-06-14 | Snap Inc. | Semantic texture mapping system |
US11206615B2 (en) | 2019-05-30 | 2021-12-21 | Snap Inc. | Wearable device location systems |
US11785549B2 (en) | 2019-05-30 | 2023-10-10 | Snap Inc. | Wearable device location systems |
US11606755B2 (en) | 2019-05-30 | 2023-03-14 | Snap Inc. | Wearable device location systems architecture |
US11917495B2 (en) | 2019-06-07 | 2024-02-27 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11821742B2 (en) | 2019-09-26 | 2023-11-21 | Snap Inc. | Travel based notifications |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11729343B2 (en) | 2019-12-30 | 2023-08-15 | Snap Inc. | Including video feed in message thread |
US11429618B2 (en) | 2019-12-30 | 2022-08-30 | Snap Inc. | Surfacing augmented reality objects |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11343323B2 (en) | 2019-12-31 | 2022-05-24 | Snap Inc. | Augmented reality objects registry |
US11943303B2 (en) | 2019-12-31 | 2024-03-26 | Snap Inc. | Augmented reality objects registry |
US11228551B1 (en) | 2020-02-12 | 2022-01-18 | Snap Inc. | Multiple gateway message exchange |
US11888803B2 (en) | 2020-02-12 | 2024-01-30 | Snap Inc. | Multiple gateway message exchange |
US11765117B2 (en) | 2020-03-05 | 2023-09-19 | Snap Inc. | Storing data based on device location |
US11516167B2 (en) | 2020-03-05 | 2022-11-29 | Snap Inc. | Storing data based on device location |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11915400B2 (en) | 2020-03-27 | 2024-02-27 | Snap Inc. | Location mapping for large scale augmented-reality |
US11776256B2 (en) | 2020-03-27 | 2023-10-03 | Snap Inc. | Shared augmented reality system |
US11430091B2 (en) | 2020-03-27 | 2022-08-30 | Snap Inc. | Location mapping for large scale augmented-reality |
US11503432B2 (en) | 2020-06-15 | 2022-11-15 | Snap Inc. | Scalable real-time location sharing framework |
US11290851B2 (en) | 2020-06-15 | 2022-03-29 | Snap Inc. | Location sharing using offline and online objects |
US11483267B2 (en) | 2020-06-15 | 2022-10-25 | Snap Inc. | Location sharing using different rate-limited links |
US11314776B2 (en) | 2020-06-15 | 2022-04-26 | Snap Inc. | Location sharing using friend list versions |
US11676378B2 (en) | 2020-06-29 | 2023-06-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US11943192B2 (en) | 2020-08-31 | 2024-03-26 | Snap Inc. | Co-location connection service |
US11961116B2 (en) | 2020-10-26 | 2024-04-16 | Foursquare Labs, Inc. | Determining exposures to content presented by physical objects |
US11601888B2 (en) | 2021-03-29 | 2023-03-07 | Snap Inc. | Determining location using multi-source geolocation data |
US11606756B2 (en) | 2021-03-29 | 2023-03-14 | Snap Inc. | Scheduling requests for location data |
US11902902B2 (en) | 2021-03-29 | 2024-02-13 | Snap Inc. | Scheduling requests for location data |
US11645324B2 (en) | 2021-03-31 | 2023-05-09 | Snap Inc. | Location-based timeline media content system |
US20220365632A1 (en) * | 2021-05-17 | 2022-11-17 | Apple Inc. | Interacting with notes user interfaces |
US11829834B2 (en) | 2021-10-29 | 2023-11-28 | Snap Inc. | Extended QR code |
US11962645B2 (en) | 2022-06-02 | 2024-04-16 | Snap Inc. | Guided personal identity based actions |
US11963105B2 (en) | 2023-02-10 | 2024-04-16 | Snap Inc. | Wearable device location systems architecture |
US11961196B2 (en) | 2023-03-17 | 2024-04-16 | Snap Inc. | Virtual vision system |
Also Published As
Publication number | Publication date |
---|---|
JP2009025920A (en) | 2009-02-05 |
US8112716B2 (en) | 2012-02-07 |
JP5184832B2 (en) | 2013-04-17 |
US9389746B2 (en) | 2016-07-12 |
US20120102430A1 (en) | 2012-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8112716B2 (en) | Information processing apparatus and control method thereof, and computer program | |
JP5188132B2 (en) | Display method of data processing apparatus | |
JP4086050B2 (en) | Information management program and information management apparatus | |
EP2677408A1 (en) | Electronic device, display method, and program | |
US20100162163A1 (en) | Image magnification | |
JP4397347B2 (en) | Input device | |
EP2606419B1 (en) | Touch-sensitive electronic device | |
US9082050B2 (en) | Computer-readable medium storing image processing program and image processing apparatus for improving the operability of image arrangement in a print preview screen | |
JP5893456B2 (en) | Display control apparatus, control method therefor, program, and storage medium | |
JP5936298B2 (en) | Display control apparatus, display control method, and program | |
US6993709B1 (en) | Smart corner move snapping | |
US8928919B2 (en) | Computer-readable medium storing image processing program and image processing apparatus | |
JP6248462B2 (en) | Information processing apparatus and program | |
US9785333B2 (en) | Display device, image processing apparatus, non-transitory computer readable medium, and display control method | |
JP5457765B2 (en) | Information processing apparatus and control method thereof | |
US9632697B2 (en) | Information processing apparatus and control method thereof, and non-transitory computer-readable medium | |
US9292185B2 (en) | Display device and display method | |
US20170038953A1 (en) | Display apparatus and display method for displaying main data and data related to that main data, and a memory medium | |
JP7130514B2 (en) | Information processing device and its control method and program | |
JP7447494B2 (en) | Display device and display control program | |
JP6365268B2 (en) | Display device, image forming apparatus, display method, and display program | |
JP2017215857A (en) | Display, display method, and program | |
JP2004094385A (en) | Area selecting system and method for image inputting device, and its program | |
US11947787B2 (en) | Image processing apparatus, image processing method, and storage medium | |
US11900044B2 (en) | Display method and display apparatus for displaying page image and thumbnail images of page image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
20080708 | AS | Assignment | Owner name: CANON KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: KOBAYASHI, KIWAMU; REEL/FRAME: 021296/0775
| STCF | Information on status: patent grant | Free format text: PATENTED CASE
| FPAY | Fee payment | Year of fee payment: 4
| FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
20200207 | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20200207