US8112716B2 - Information processing apparatus and control method thereof, and computer program - Google Patents
- Publication number
- US8112716B2 (application no. US 12/170,994)
- Authority
- US
- United States
- Prior art keywords
- window
- instruction
- display contents
- border
- scrolled
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
Definitions
- the present invention relates to an information processing apparatus and control method thereof, and a computer program.
- an information processing system which can simultaneously execute a plurality of applications with user interfaces, can display a plurality of windows corresponding to the applications at the same time, and can control the respective windows to serve as independent user interfaces.
- the information processing system can display the plurality of windows by one of the following methods.
- a method of overlapping the windows at arbitrary locations according to the rule of a predetermined priority order upon displaying the respective windows is available (overlap method).
- a method of tiling the windows without overlapping each other upon displaying the respective windows is available (tiling window method).
- the overlap method is more effective.
- windows allow modification of their sizes and locations in the X and Y directions independently or simultaneously.
- the windows need to be moved or resized to avoid completely covered windows as a result of window overlap.
- a display controller of an information processing apparatus displays a window to be prioritized or a window selected by the user for access in front of all other windows in each case.
- the whole area of the window displayed in front of all other windows is displayed, and partial areas of other windows are displayed based on their overlapping states.
- This operation includes that of switching display to locate a desired window in front of all other windows and that of downsizing or moving the windows located in front of the target window.
- a window is resized by dragging one border or corner of the window.
- the window is moved by dragging a specific region which is not used for resizing.
- When a window is resized, a specification prepared in advance for each window type is used, and display control upon resizing is performed based on that specification. More specifically, one available specification moves the display contents in response to dragging when a window is resized by dragging one border or corner. Another available specification does not move the display contents irrespective of dragging. Yet another available specification moves the display contents at a predetermined ratio with respect to dragging, or reduces or otherwise modifies them.
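These three conventional per-window-type specifications can be summarized in a minimal Python sketch. The function name, the policy names, and the ratio default are invented here for illustration and are not part of the patent:

```python
# Hypothetical sketch of the three conventional resize specifications:
# contents follow the drag, contents stay fixed, or contents move at a
# predetermined ratio with respect to the drag.

def content_shift(drag_delta: int, policy: str, ratio: float = 0.5) -> int:
    """Return how far the display contents move for a given drag delta."""
    if policy == "follow":   # contents move with the dragged border
        return drag_delta
    if policy == "fixed":    # contents do not move, irrespective of dragging
        return 0
    if policy == "ratio":    # contents move at a predetermined ratio
        return int(drag_delta * ratio)
    raise ValueError(f"unknown policy: {policy}")
```

The point of the background discussion is that the policy is fixed per window type in advance; the user cannot switch it during the drag.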
- FIG. 21 shows the configuration of a window to be displayed on a display device.
- FIG. 21 shows a window that displays a document.
- FIGS. 22A to 22D and FIGS. 23A to 23D are explanatory views of popular display control methods upon resizing a window.
- FIGS. 22A to 22D are views showing cases in which the window shown in FIG. 21 is resized by dragging one of the four borders.
- FIGS. 22A and 22C show the cases in which the window size is reduced by moving the right or bottom border. In these cases, the display contents near the border to be moved are gradually hidden.
- FIGS. 22B and 22D show the cases in which the window size is reduced by moving the left or top border. In these cases, the display contents near the right or bottom border opposite to the border to be moved are gradually hidden.
- FIGS. 23A to 23D show cases in which the window shown in FIG. 21 is resized by dragging the corners of the window. Note that the corners of the window mean the intersections of the respective borders that define the window.
- the concept of the display control shown in FIGS. 22A to 22D and FIGS. 23A to 23D is to basically preferentially display the left and up directions of the display contents of a window.
- many windows which aim at the drawing function and display of general figures do not always preferentially display the left and up directions, and different specifications are determined in advance for respective window types.
- Most windows have scroll bars to shift the position of the display contents.
- by operating the scroll bar, the user can move the contents that he or she wants to display or access to a desired position within the window.
- a certain window often configures parent and child windows defined by predetermined specifications, so as to prevent related windows from becoming difficult to see due to overlapping display, or to prevent confusion about the correspondence between related windows.
- Embodiments of the present invention provide a technique that allows the user to arbitrarily and intuitively move a desired, prioritized part of the display contents to a predetermined location concurrently with resizing a window.
- an information processing apparatus comprising: a display unit configured to display a window; an accepting unit configured to accept a resize instruction of the displayed window together with a scroll instruction indicating whether or not to scroll display contents within the window; and a control unit configured to control a size of the window and scrolling of the display contents within the window based on the contents of the resize instruction and the scroll instruction, wherein when the scroll instruction indicates that the display contents are to be scrolled, the control unit changes the window to the size indicated by the resize instruction and scrolls the display contents according to the change amount of the window, and when the scroll instruction indicates that the display contents are not to be scrolled, the control unit changes the window to the size indicated by the resize instruction and suppresses scrolling of the display contents.
- a method of controlling an information processing apparatus comprising: displaying a window on a display unit; accepting a resize instruction of the displayed window together with a scroll instruction indicating whether or not to scroll display contents within the window; and controlling a size of the window and scrolling of the display contents within the window based on the contents of the resize instruction and the scroll instruction, wherein when the scroll instruction indicates that the display contents are to be scrolled, the window is changed to the size indicated by the resize instruction and the display contents are scrolled according to the change amount of the window, and when the scroll instruction indicates that the display contents are not to be scrolled, the window is changed to the size indicated by the resize instruction and scrolling of the display contents is suppressed.
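The claimed control behavior can be illustrated with a small Python sketch. The function name, the use of 1-D coordinates, and the sign convention are assumptions made for illustration, not part of the patent:

```python
def apply_resize(window_size: int, content_origin: int,
                 new_size: int, scroll: bool) -> tuple[int, int]:
    """Change the window to the instructed size; scroll the contents by the
    window's change amount when the scroll instruction is ON, and suppress
    scrolling when it is OFF (all quantities are 1-D for simplicity)."""
    change = new_size - window_size    # change amount of the window
    if scroll:
        content_origin += change       # contents follow the change amount
    return new_size, content_origin
```

For example, shrinking a 100-pixel-wide window to 80 pixels with scrolling ON shifts the content origin by the same -20 pixels, while with scrolling OFF the contents stay put.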
- FIG. 1A is a block diagram showing an example of the hardware arrangement of an information processing apparatus according to an embodiment of the invention
- FIG. 1B shows an example of the arrangement of a mouse as an example of an operation unit 109 according to the embodiment of the invention
- FIG. 1C shows an example of the arrangement of a digital pen and tablet as an example of the operation unit 109 according to the embodiment of the invention
- FIG. 2 shows an example of the configuration of a window according to the embodiment of the invention
- FIGS. 3A to 3D show display examples when the user locates a cursor on a first or second region of a top border 201 or bottom border 202 of a window and drags it according to the first embodiment of the invention
- FIGS. 4A to 4D show display examples when the user locates the cursor on a first or second region of a left border 203 or right border 204 of a window and drags it according to the first embodiment of the invention
- FIG. 5A is a view for explaining ON/OFF switching of scrolling upon resizing according to the first embodiment of the invention
- FIG. 5B is a view for explaining ON/OFF switching of scrolling upon resizing when the cursor position is changed from the display state of FIG. 5A ;
- FIG. 6 is a flowchart showing an example of window resizing processing according to the first embodiment of the invention.
- FIG. 7A shows an example of a state in which the size of a window 200 matches that of a whole display screen 700 according to the second embodiment of the invention
- FIG. 7B shows an example of a state in which the size of the window 200 changes when the user locates a cursor 701 on a second region 203 b and drags it in the X direction according to the second embodiment of the invention
- FIG. 7C shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a first region 203 a and drags it in the X direction according to the second embodiment of the invention
- FIG. 8A shows an example of a state in which the size of the window 200 matches that of the whole display screen 700 according to the second embodiment of the invention
- FIG. 8B shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a second region 202 b and drags it in the Y direction according to the second embodiment of the invention
- FIG. 8C shows an example of a state in which the size of the window 200 changes when the user locates the cursor 701 on a first region 202 a and drags it in the Y direction according to the second embodiment of the invention
- FIG. 9A shows an example of a state before the beginning of dragging when the user locates a cursor P on a corner 209 (P 0 ) in a display control method according to the third embodiment of the invention
- FIG. 9B shows an example of a state in which the user moves the cursor P from the position P 0 on the corner 209 to a position P 1 in the display control method according to the third embodiment of the invention
- FIG. 9C shows an example of a state in which the user moves the cursor P from P 1 to P 2 in the display control method according to the third embodiment of the invention.
- FIG. 10 is a flowchart showing an example of window resizing processing according to the third embodiment of the present invention.
- FIG. 11 shows an example of a window including a plurality of sub-windows
- FIG. 12 is a view for explaining the fourth embodiment of the invention taking as an example a window which is divided into left and right sub-windows as first and second sub-windows;
- FIG. 13 shows an example of a change in display contents when the user moves a boundary in a window divided by one boundary according to the fourth embodiment of the invention
- FIG. 14 is a flowchart showing an example of window resizing processing according to the fourth embodiment of the invention.
- FIGS. 15A and 15B show division examples of boundaries
- FIG. 16 shows an example of display contents of a window according to the fifth embodiment of the invention.
- FIGS. 17A and 17B show display examples when the user resizes (reduces) the window by dragging one border of the window according to the fifth embodiment of the invention
- FIGS. 18A and 18B show display examples when the user resizes the window by dragging one corner of the window according to the fifth embodiment of the invention
- FIGS. 19A and 19B show display examples that allow a normally hidden part to be easier to see according to the fifth embodiment of the invention.
- FIG. 20 is a flowchart showing an example of window resizing processing corresponding to the display examples shown in FIGS. 17A and 17B ;
- FIG. 21 shows the configuration of a window displayed on a display device
- FIGS. 22A to 22D show cases in which the user resizes the window shown in FIG. 21 by dragging one of four borders;
- FIGS. 23A to 23D show cases in which the user resizes the window shown in FIG. 21 by dragging one of four corners of the window.
- the present invention provides a technique which arbitrarily controls, concurrently with dragging, whether or not to scroll the display contents of a window in response to dragging upon resizing the window by dragging an element (border, corner, boundary, etc.) which configures the window.
- the present invention proposes the following three control techniques.
- the first control technique covers a case in which the user resizes a window by mainly dragging one border of the window.
- This technique is characterized in that the direction component of the cursor motion that is not directly related to resizing is used for control.
- two different regions are formed on each border of a window, and scrolling can be switched ON or OFF concurrently with dragging by selecting one of those regions during the drag.
- the second control technique is characterized in that ON/OFF of scrolling is controlled by operating a button other than that for dragging of an operation unit upon making a drag movement. This technique can be applied to both a case of dragging a corner and that of dragging a border.
- the third control technique executes control by combining the first and second control techniques.
- each sub-window is resized by dragging a boundary of the sub-window.
- this technique can also be applied to a window including many sub-windows.
- the first embodiment of the invention will be described hereinafter. This embodiment relates to the first control technique.
- FIG. 1A is a block diagram showing an example of the hardware arrangement of an information processing apparatus used to implement the present invention.
- a CPU 101 executes an OS, application programs, and the like stored in an HD (hard disk) 103 , and controls to temporarily store information, files, and the like required for execution of the programs in a RAM 102 .
- the RAM 102 serves as a main memory, work area, and the like of the CPU 101 .
- the HD 103 stores the application programs, driver programs, the OS, control programs, a processing program required to execute processing according to this embodiment, and the like.
- a display unit 104 displays information according to commands input from an operation unit 109 , externally acquired information, and the like.
- the display unit 104 may adopt any display method of CRT type, liquid crystal type, PDP type, SED type, and organic EL type.
- the display unit 104 displays a window according to this embodiment.
- a network interface (to be referred to as “I/F” hereinafter) 105 is a communication interface used to connect to a network.
- a ROM 106 stores programs such as a basic I/O program and the like.
- An external storage drive 107 can load programs and the like stored in a medium 108 to this computer system.
- the medium 108 as a storage medium stores predetermined programs and related data.
- the operation unit 109 is a user interface used to accept operations and instructions from an operator of this apparatus, and comprises a keyboard, mouse, digital pen, and the like.
- a system bus 110 controls the flow of data in the apparatus.
- a mouse, digital pen, and tablet as examples of the operation unit 109 can have the arrangements shown in, for example, FIGS. 1B and 1C .
- the mouse and tablet are connected to an information processing apparatus 100 using USB connections, and can serve as the operation unit 109 .
- a mouse 120 shown in FIG. 1B can constitute a part of the operation unit 109 .
- the mouse 120 has the left button 121 and the right button 122 .
- the bottom surface of the mouse 120 comprises a structure for detecting a moving amount and direction of the mouse 120 using a mechanical mechanism using a ball or an optical mechanism using an optical sensor.
- a digital pen 130 and tablet 140 shown in FIG. 1C can constitute a part of the operation unit 109 .
- the digital pen 130 can comprise a tip switch 131 at the pen tip, and a side switch 132 on the side surface.
- the tip switch 131 corresponds to the left button 121 of the mouse 120
- the side switch 132 corresponds to the right button 122 of the mouse 120 .
- the tip switch 131 can be turned on by pressing it against the tablet 140 .
- the side switch 132 can be turned on when the operator holds it down with the finger.
- the tablet 140 comprises a pressure-sensitive or electrostatic contact sensor, and can detect the position of the digital pen 130 when the tip of the digital pen 130 is pressed against the tablet 140 .
- the tablet 140 can detect the moving direction and amount of the digital pen 130 .
- the tablet 140 may be integrated with the display unit 104 .
- FIG. 2 shows an example of the configuration of a window according to the embodiment of the invention.
- a window 200 has a rectangular shape, and is defined by four borders, that is, a top border 201 , bottom border 202 , left border 203 , and right border 204 .
- the window 200 has four corners 207 , 208 , 209 , and 210 .
- the corner 207 is defined as an intersection between the top border 201 and left border 203
- the corner 208 is defined as an intersection between the left border 203 and bottom border 202
- the corner 209 is defined as an intersection between the bottom border 202 and right border 204
- the corner 210 is defined as an intersection between the right border 204 and top border 201 .
- each border is divided into two different regions, that is, first and second regions. More specifically, the first region is located to include the center of the border, and the second regions are located to include the end portions of the border and to sandwich the first region. For example, on the top border 201 , a first region 201 a including the center of the border is sandwiched between second regions 201 b including the end portions of the border.
- each border may be equally divided into three regions, or the first region may be slightly longer or shorter than one third of the border. This embodiment will exemplify a case in which one border is equally divided into three regions.
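The region layout described above (middle third versus outer thirds of a border) amounts to a simple hit test, sketched below. The function name and the normalized-coordinate approach are illustrative assumptions:

```python
def border_region(pos: float, border_start: float, border_len: float) -> str:
    """Classify a cursor position along a border equally divided into three:
    the middle third is the first region, the outer thirds are the second
    regions; anything off the border is 'outside'."""
    t = (pos - border_start) / border_len   # normalize to [0, 1] along border
    if not 0.0 <= t <= 1.0:
        return "outside"
    return "first" if 1 / 3 <= t <= 2 / 3 else "second"
```

In the first embodiment, dragging while the cursor is in the first region yields resizing with scrolling, and dragging in a second region yields resizing without scrolling.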
- the window 200 includes a title bar 205 and display area 206 .
- the title bar 205 displays information corresponding to the content displayed in the display area 206 .
- the title bar 205 displays a document name.
- the display area 206 displays the contents of data to be displayed.
- the display area 206 displays the contents of a document for a document file, or displays a corresponding image or graphic information for an image or graphic file.
- the window can be resized by dragging one of the four borders of the window based on the operation of the operation unit 109 , and moving the selected border in a direction perpendicular to that border. That is, in this embodiment, the drag operation corresponds to a window resize instruction operation.
- the window resize instruction, including a scroll instruction indicating whether or not to scroll the display contents within the window, is accepted.
- this embodiment uses “drag” as a term that represents the concept to be described below.
- the mouse shown in FIG. 1B is used as the operation unit 109 with the default settings of Microsoft Windows®.
- the display position of a cursor displayed on the screen of the display unit 104 is controlled in response to the movement of the mouse 120 .
- when the user presses the left button 121 while the cursor is located on a target to be selected, that target is highlighted.
- moving the cursor by moving the mouse 120 in this state will be referred to as “dragging”.
- FIGS. 3A to 3D show display examples according to this embodiment when the user drags the cursor while locating it on the first or second region of the top border 201 or bottom border 202 of the window in the display state of FIG. 2 .
- FIGS. 4A to 4D show display examples according to this embodiment when the user drags the cursor while locating it on the first or second region of the left border 203 or right border 204 of the window in the display state of FIG. 2 .
- in FIGS. 3A and 3C and FIGS. 4A and 4C , it can also be seen as if the display contents were moving in correspondence with the movement of the border.
- such change in display contents will be referred to as “resizing with scrolling”.
- a state in which the display contents of the display area 206 are moved and displayed in correspondence with the movement of the border will be referred to as “with scrolling”, “the display contents are scrolled”, or “scrolling the display contents”.
- the display contents near a border (first border) where the cursor is located are changed. More specifically, the display contents are changed so as to be hidden in turn by the first border. On the other hand, the display contents near a border (second border) opposite to the border (first border) where the cursor is located remain unchanged.
- in FIGS. 3B and 3D and FIGS. 4B and 4D , it can also be seen as if the display contents were fixed with respect to the movement of the border. In this embodiment, such a change in display contents will be referred to as “resizing without scrolling”.
- a state in which the display contents on the display area 206 are fixedly displayed with respect to the whole display screen will be referred to as “without scrolling”, “the display contents are not scrolled”, or “not scrolling the display contents”.
- “resizing with scrolling” and “resizing without scrolling” can be executed during resizing in a continuous drag operation. That is, the resizing with scrolling and that without scrolling can be switched in real time during a continuous, single drag operation. Hence, the user can resize the window while adjusting the display position.
- FIGS. 5A and 5B are views for explaining this switching according to this embodiment.
- the width and height directions of the window 200 respectively match the X and Y directions of an X-Y coordinate system 502 set on the display screen where the window 200 is displayed.
- P 0 represents an initial position of the cursor.
- FIG. 5B expresses a state in which a position P(Px, Py) of the cursor is continuously changed like P 0 ⁇ P 1 ⁇ P 2 ⁇ P 3 or P 5 ⁇ P 6 ⁇ P 7 ⁇ P 8 during a single drag operation.
- P(Px, Py) is a coordinate value based on the X-Y coordinate system 502 set on the display screen.
- ⁇ Cx and ⁇ Px are differences between Cx and Px at the beginning of the resizing with scrolling, and Cx and Px after the window is resized. Note that these differences correspond to change amounts of the window 200 in the X direction.
- switching between “resizing with scrolling” and “resizing without scrolling” is executed, concurrently with dragging, according to a change in the position of the cursor in the Y direction.
- FIG. 6 is a flowchart showing an example of the window resizing processing according to the first embodiment.
- the processing corresponding to the flowchart shown in FIG. 6 is implemented when the CPU 101 reads out a corresponding processing program stored in the HD 103 onto the RAM 102 and executes that program to control respective components.
- FIG. 6 describes a case wherein the user resizes the window by dragging the left border 203 of the window 200 .
- the embodiment of the invention is not limited to the case wherein the left border 203 is dragged. That is, the same processing as in FIG. 6 can resize the window by dragging the top border 201 , bottom border 202 , and right border 204 .
- in step S 601 , the CPU 101 acquires operation information (information of a first instruction operation) of a first button of the mouse 120 or digital pen 130 of the operation unit 109 , and information (moving information) of the moving direction and amount of the mouse 120 or digital pen 130 .
- the first button (first operation unit) corresponds to the left button 121 of the mouse 120 if the mouse 120 is used in the default settings of Microsoft Windows®.
- the first button corresponds to the tip switch 131 at the pen tip of the digital pen 130 .
- in step S 603 , the CPU 101 calculates the position coordinate of the cursor (cursor position coordinate) based on the moving amount information acquired in step S 601 to determine on which border of the window 200 the cursor is located. This determination can be attained by checking which of the predetermined regions, set based on the first and second regions of the borders that configure the window 200 , includes the cursor position coordinate.
- if it is determined in step S 603 that the cursor is located on the left border 203 of the window 200 (“left border” in step S 603 ), it can be determined that the user has begun to drag the left border 203 . In this case, the process advances to step S 604 . On the other hand, if the cursor is located on one of the remaining borders (the top border 201 , bottom border 202 , or right border 204 ) (“another border” in step S 603 ), it can be determined that the user has begun to drag another border. In this case, the process advances to step S 605 . In step S 605 , the CPU 101 executes window resizing processing by dragging of another border.
- in step S 604 , the CPU 101 determines the cursor position coordinate P(Px, Py) at the beginning of dragging, as shown in FIG. 5A , for the window which begins to be dragged. Also, the CPU 101 determines the position C(Cx, Cy) of the corner 208 at the lower end of the left border 203 and the position Q(Qx, Qy) of arbitrary display contents, as shown in FIG. 5B .
- in step S 606 , the CPU 101 further acquires the operation information of the first button and the moving amount information, and updates the cursor position coordinate P(Px, Py) based on the moving amount information.
- the CPU 101 determines in step S 607 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S 607 ), this processing ends. In this case, a so-called “drop” operation is made.
- in step S 608 , the CPU 101 sets the X position (Cx) of the left border 203 of the window 200 to match the X component (Px) of the cursor position coordinate updated in step S 606 . In this way, the position of the left border 203 follows the movement of the cursor in the X direction.
- the CPU 101 determines in step S 609 based on the cursor position coordinate updated in step S 606 whether or not the cursor is located on the first region. If it is determined that the cursor is located on the first region (“YES” in step S 609 ), the process advances to step S 610 . On the other hand, if it is determined that the cursor is located on the second region (“NO” in step S 609 ), the process advances to step S 611 .
- in step S 610 , the CPU 101 sets the moving amount ΔQx of the position Q of the arbitrary display contents in the X direction to be equal to the moving amount ΔPx of the cursor in the X direction, so as to scroll the display contents upon resizing the window.
- in step S 611 , the CPU 101 sets the moving amount ΔQx to zero so as to suppress scrolling of the display contents upon resizing the window.
- in step S 612 , the CPU 101 updates the display of the cursor and window 200 based on the position of the left border 203 determined in step S 608 and the moving amount ΔQx determined in step S 610 or S 611 . After that, the process returns to step S 606 to continue the processing.
- the loop back to step S 606 represents cursor movement during dragging; that is, dragging is continued and resizing of the window is in progress during this loop.
- exiting the loop (“NO” in step S 607 ) represents that the drop operation has been made to settle the window size.
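The loop of steps S 606 to S 612 for dragging the left border can be sketched in Python as follows. The event representation, the dictionary layout, and the helper name are assumptions made for illustration, and the display update of step S 612 is elided:

```python
def in_first_region(py: float, border_len: float) -> bool:
    """True when the cursor's Y position falls in the first (middle-third)
    region of a vertical border of length border_len starting at y = 0."""
    return border_len / 3 <= py <= 2 * border_len / 3

def resize_left_border(events, window: dict, border_len: float) -> dict:
    """events: sequence of (button_on, px, py) samples during a drag.
    window: {'cx': left-border X position, 'qx': content X position}."""
    prev_px = window["cx"]
    for button_on, px, py in events:
        if not button_on:                  # S607 "NO": drop settles the size
            break
        window["cx"] = px                  # S608: border follows cursor X
        if in_first_region(py, border_len):
            window["qx"] += px - prev_px   # S610: scroll with the drag
        # else S611: moving amount is zero, scrolling is suppressed
        prev_px = px
        # S612: display update is elided in this sketch
    return window
```

A drag that starts in the first region and then moves into a second region scrolls the contents only for the first part of the drag, which matches the real-time switching described above.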
- each border of the window is divided into two different regions, and how the display contents within the window change can be controlled based on the selected region. Since the region can be selected in real time while the window is being resized, the position of the display contents within the window can be controlled simultaneously with resizing. In this way, a desired display result can be obtained by a series of operations, thus improving work efficiency.
- display states 1 and 3 will now be compared. In the case of display state 1 , since the window itself is fixed, there is no trouble in handling the window. However, in order to refer to another window, a switching operation for canceling the full screen display state is required.
- window display of the first embodiment is applied to so-called “full screen display”.
- a window is maximized in the X and Y directions of the display screen of the display unit 104 , and is fixed in size.
- the window cannot be resized unless the full screen display state is canceled.
- a window is maximized in only one of the X and Y directions within the display screen, and is fixed in size in that direction.
- one border is fixed to the end of the display screen, and only the other border is movable by dragging. By operating this border that can be dragged, the window can be resized in one direction.
- FIGS. 7A to 7C show examples of full screen display according to this embodiment.
- reference numeral 700 denotes a whole display screen of the display unit 104 . Since the window configuration is the same as that in FIG. 2 of the first embodiment, corresponding reference numerals will be used.
- a left border 203 of a window 200 includes first region 203 a and second regions 203 b . The user can drag the first and second regions 203 a and 203 b using a cursor 701 .
- the directions of the whole display screen 700 and window 200 are determined based on an X-Y coordinate system 502 .
- FIG. 7A shows a state in which the size of the window 200 matches that of the whole display screen 700 . That is, FIG. 7A corresponds to the full screen display state.
- FIG. 7B shows a state in which the window 200 is resized when the user locates the cursor 701 on the second region 203 b and drags it in the X direction.
- the size of the window 200 changes in only the X direction.
- a right border 204 opposite to the dragged left border 203 is fixed to the end of the display area, and only the left border 203 can be dragged.
- the window is resized in one direction.
- the window 200 is fixed in a maximum size in the Y direction perpendicular to the dragging direction. Note that in case of FIG. 7B , since the second region 203 b is used, resizing without scrolling described in the first embodiment is executed.
- FIG. 7C shows a state in which the window is resized when the user locates the cursor 701 on the first region 203 a and drags it in the X direction.
- the size of the window 200 changes in only the X direction.
- the right border 204 opposite to the dragged left border 203 is fixed to the end of the display area, and only the left border 203 can be dragged.
- the window 200 is resized in one direction. Note that the window 200 is fixed in a maximum size in the Y direction perpendicular to the dragging direction. Note that resizing with scrolling described in the first embodiment is executed since the first region 203 a is used at this time.
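The one-direction resize of the second embodiment, with the right border fixed at the screen edge and the height kept maximized, can be sketched as follows. The function name, the clamping behavior, and the 1-D coordinates are illustrative assumptions:

```python
def one_direction_resize(screen_w: int, drag_px: int,
                         scroll: bool, content_x: int = 0):
    """Second-embodiment sketch: the right border is fixed at x = screen_w,
    only the left border moves, and the window stays maximized in Y.
    Returns (new window width, content X offset)."""
    left = max(0, min(drag_px, screen_w))  # clamp the dragged border on-screen
    width = screen_w - left
    if scroll:            # first region 203 a: resizing with scrolling
        content_x += left  # contents follow the moved left border
    return width, content_x
```

Dragging the left border inward on an 800-pixel-wide screen to x = 200 leaves a 600-pixel window either way; only the content offset differs between the first-region and second-region cases.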
- In FIGS. 7A to 7C , the left border 203 is used as the border having the window-resizing function.
- However, any of the remaining three borders that configure the window 200 may instead be used as the border having the window-resizing function.
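The one-direction resizing described above can be sketched in a few lines. This is an illustrative Python sketch, not code from the patent; the function name, the window representation, and the `scroll` flag (first region gives resizing with scrolling, second region gives resizing without scrolling) are assumptions made for the example.

```python
# Hypothetical sketch of one-direction resizing in the full-screen state:
# the dragged left border follows the cursor, the opposite (right) border
# stays pinned to the display edge, and the Y direction stays maximized.

def resize_maximized_window(window, screen_width, new_left_x, scroll):
    """Resize a Y-maximized window by dragging its left border.

    window: dict with 'left', 'right' (border X positions) and 'scroll_x'
    (X offset of the display contents).
    scroll: True when the drag uses the first region (resize with
    scrolling), False for the second region (resize without scrolling).
    """
    old_left = window['left']
    # The dragged border follows the cursor, clamped to the display area.
    window['left'] = max(0, min(new_left_x, window['right']))
    # The border opposite the dragged one stays fixed at the display edge.
    window['right'] = screen_width
    if scroll:
        # Resizing with scrolling: the contents move with the dragged
        # border, so the contents near the moving border stay in view.
        window['scroll_x'] += window['left'] - old_left
    return window

w = {'left': 0, 'right': 800, 'scroll_x': 0}
resize_maximized_window(w, 800, 120, scroll=True)
```

With `scroll=False` the same call leaves `scroll_x` untouched, so the contents stay fixed relative to the opposite border instead.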
- FIGS. 8A to 8C show a case using a bottom border 202 . That is, FIG. 8A shows an example of a state in which the size of the window 200 according to this embodiment matches that of the whole display screen 700 .
- FIG. 8B shows an example of a state in which the window 200 is resized when the user locates the cursor 701 on a second region 202 b and drags it in the Y direction according to this embodiment.
- FIG. 8C shows an example of a state in which the window is resized when the user locates the cursor 701 on a first region 202 a and drags it in the Y direction according to this embodiment.
- a border that is movable can also be referred to as a “movable border”
- a border located at a position opposite to the movable border can also be referred to as a “first fixed border (opposing fixed border)”
- the remaining two borders can also be referred to as a “second fixed border” and “third fixed border”.
- the left border 203 corresponds to the movable border
- the right border 204 corresponds to the first fixed border (opposing fixed border)
- a top border 201 and the bottom border 202 respectively correspond to the second and third fixed borders.
- the bottom border 202 corresponds to the movable border
- the top border 201 corresponds to the first fixed border (opposing fixed border)
- the left and right borders 203 and 204 respectively correspond to the second and third fixed borders.
- the display position on a display area 206 of the window 200 can be controlled in the same manner as in the first embodiment. However, the only difference is that the first and second regions, which were given to all four borders in the first embodiment, are limited to a single border in this embodiment.
- the window according to this embodiment is maintained in a maximized state in one of the X and Y directions (the width and height directions). Therefore, upon reordering a plurality of windows, only a one-dimensional positional relationship need be considered. As a result, compared to reordering windows in consideration of a two-dimensional positional relationship, the operation is greatly simplified.
- Since the window can be resized, a window hidden below the upper window can be revealed, unlike the case in which a window is completely maximized in both the X and Y directions, thus improving convenience.
- Such a window can be defined as a fourth window display state in addition to the aforementioned window display states 1 to 3 .
- the point of this embodiment is not merely that the window can be resized in one direction in the full screen display state; rather, it lies in the fact that the display position of the display contents within the window can be controlled during the drag operation, in combination with the invention according to the first embodiment.
- the aforementioned first embodiment has proposed the display control method upon resizing the window by dragging one of the borders which configure the window.
- This method is effective in cases in which the window is resized mainly by dragging a border.
- this method is very effective for the window which is maximized in only one direction, as described in the second embodiment.
- a normal window can be resized by dragging one of its corners, as shown in FIGS. 23A to 23D . Whether a user drags a border or a corner to resize such a normal window depends on the user's preference, the display contents of individual applications, individual work contents, and the like.
- This embodiment proposes a method that can switch scrolling ON/OFF in real time during resizing, as in the first embodiment, even when a window is resized by dragging its corner.
- this embodiment uses the ON/OFF state of a second button of a mouse 120 or digital pen 130 of an operation unit 109 to switch between resizing with scrolling and resizing without scrolling.
- the second button (second operation unit) corresponds to a right button 122 of the mouse 120 in the default settings of Microsoft Windows®.
- the second button corresponds to a side switch 132 on the side surface of the digital pen 130 .
- the second button may be assigned to a specific key such as a control key.
- FIG. 9A shows a state before the beginning of dragging, in which the user locates a cursor P on a corner 209 (P 0 ).
- FIG. 9B shows a state in which the user moves the cursor P from the position P 0 to a position P 1 of the corner 209 .
- the user turns on the second button to execute the resizing with scrolling.
- FIG. 9C shows a state in which the user moves the cursor P from P 1 to P 2 . Upon this cursor movement, the user turns off the second button to execute the resizing without scrolling.
- a dotted line 901 in FIGS. 9B and 9C indicates the size of a window 200 before resizing.
- the contents within a dotted line 902 indicate the display contents falling outside the window 200 after resizing.
- the first button is kept ON during dragging irrespective of ON/OFF of the second button.
- FIGS. 9A to 9C are views for explaining the display control method of this embodiment using the window configuration corresponding to FIG. 2 , but they omit the first and second regions for the sake of simplicity. Note that the third embodiment can be practiced in combination with the first embodiment, and this embodiment can also be applied to the window shown in FIG. 2 , which has the first and second regions.
- This embodiment can assure similar operations on any of four corners 207 to 210 of the window 200 , and the following description will be given taking as an example a case in which the user drags the lower right corner 209 .
- the position C changes like C 0 , C 1 , and C 2
- the position B changes like B 0 , B 1 , and B 2
- the position Q changes like Q 0 , Q 1 , and Q 2 in correspondence with the movement of the cursor position from P 0 to P 1 and to P 2 .
- FIG. 10 is a flowchart showing an example of the window resizing processing according to the third embodiment.
- the processing corresponding to the flowchart shown in FIG. 10 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control respective components.
- In step S 1001 , the CPU 101 acquires operation information (information of a first instruction operation) of a first button of the mouse 120 or digital pen 130 of the operation unit 109 , and information (moving information) of the moving direction and amount of the mouse 120 or digital pen 130 .
- the first button corresponds to the left button 121 of the mouse 120 if the mouse 120 is used in the default settings of Microsoft Windows®.
- the first button corresponds to a tip switch 131 at the pen tip of the digital pen 130 .
- the CPU 101 determines in step S 1002 based on the operation information of the first button acquired in step S 1001 whether or not the first button is switched from OFF to ON. If it is determined that the first button is switched to ON (“YES” in step S 1002 ), the process advances to step S 1003 . On the other hand, if it is determined that the first button is kept OFF without being switched to ON (“NO” in step S 1002 ), the process returns to step S 1001 to continue the processing.
- In step S 1003 , the CPU 101 calculates the position coordinate of the cursor P (cursor position coordinate) based on the moving amount information acquired in step S 1001 to determine on which corner of the window 200 the cursor is located. This determination can be made by checking which of the predetermined regions, set based on the corners that configure the window 200 , includes the cursor position coordinate.
- If it is determined in step S 1003 that the cursor is located on the lower right corner 209 of the window 200 (“lower right corner 209 ” in step S 1003 ), it can be determined that the user begins to drag the lower right corner 209 . In this case, the process advances to step S 1004 . On the other hand, if the cursor is located on one of the remaining corners (one of the corners 207 , 208 , and 210 ) (“another” in step S 1003 ), it can be determined that the user begins to drag another corner. In this case, the process advances to step S 1005 . In step S 1005 , the CPU 101 executes window resizing processing by dragging of another corner.
- In step S 1004 , the CPU 101 determines the position coordinates P(Px, Py), C(Cx, Cy), B(Bx, By), and Q(Qx, Qy) at the beginning of dragging, as shown in FIG. 9A , for the window which begins to be dragged. Note that the definitions of the respective coordinates are the same as those described above.
- In step S 1006 , the CPU 101 further acquires the information of the first instruction operation and moving amount information, and also operation information of a second button (information of a second instruction operation) of the mouse 120 or digital pen 130 of the operation unit 109 . Also, the CPU 101 updates the cursor position coordinate P(Px, Py) based on the moving amount information. The CPU 101 then determines in step S 1007 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S 1007 ), this processing ends. In this case, a so-called “drop” operation is made.
- In step S 1008 , the CPU 101 sets the position C(Cx, Cy) of the lower right corner 209 of the window 200 to match the cursor position P(Px, Py) updated in step S 1006 . In this way, the position of the lower right corner 209 follows the cursor movement.
- the CPU 101 determines in step S 1009 based on the operation information of the second button acquired in step S 1006 whether or not the second button is ON. If it is determined that the second button is ON (“YES” in step S 1009 ), the process advances to step S 1010 . On the other hand, if it is determined that the second button is OFF (“NO” in step S 1009 ), the process advances to step S 1011 .
- In step S 1010 , the CPU 101 sets the moving amount ΔQ(ΔQx, ΔQy) of the position Q of the arbitrary display contents to be equal to the moving amount ΔP(ΔPx, ΔPy) of the cursor. In this way, the display contents are scrolled by a size corresponding to the change amounts of the window 200 in the X and Y directions.
- In step S 1011 , the CPU 101 sets the moving amount ΔQ to (0, 0). In this case, the display contents are not scrolled.
- In step S 1012 , the CPU 101 updates the display of the cursor and window 200 based on the position of the lower right corner 209 determined in step S 1008 and the moving amount ΔQ determined in step S 1010 or S 1011 . After that, the process returns to step S 1006 to continue the processing.
- Returning to step S 1006 represents cursor movement during dragging; that is, dragging continues and resizing of the window is in progress during this loop.
- On the other hand, if the first button is switched OFF, this represents that the drop operation has been made to settle the window size.
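The drag loop of FIG. 10 (steps S 1006 to S 1012 ) can be paraphrased in a short sketch. This is a hedged illustration, not the patent's implementation; the one-call-per-event structure and the state dictionary are assumptions made for the example.

```python
def corner_drag_step(state, cursor, first_button_on, second_button_on):
    """One pass through the drag loop of FIG. 10 (steps S1006-S1012).

    state: dict with 'corner' (x, y), the dragged lower right corner C,
    and 'content' (x, y), the position Q of arbitrary display contents.
    Returns False when the first button is released (the drop, S1007).
    """
    if not first_button_on:          # S1007: drop settles the window size
        return False
    # S1006/S1008: the corner position C follows the cursor position P.
    dx = cursor[0] - state['corner'][0]
    dy = cursor[1] - state['corner'][1]
    state['corner'] = cursor
    if second_button_on:
        # S1009/S1010: second button ON -> resize with scrolling, dQ = dP.
        qx, qy = state['content']
        state['content'] = (qx + dx, qy + dy)
    # S1011: second button OFF -> dQ = (0, 0); contents are not scrolled.
    return True                      # S1012: redraw, then loop to S1006
```

Because the second-button state is sampled on every pass, scrolling can be toggled in the middle of a single drag, which is the point of this embodiment.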
- the two different operation buttons of the operation unit 109 are used, and the change method of the display contents within the window can be controlled based on combinations of the button operations. Since the combinations of the button operations can be changed in real time during resizing of the window, the position of the display contents within the window can be controlled simultaneously with resizing. In this way, a desired display result can be obtained by a series of operations, thus improving the work efficiency.
- the window is resized by mainly dragging the corner of the window.
- the display control method according to this embodiment can be applied to a case wherein the window is resized by dragging its border.
- ON/OFF of scrolling upon resizing can be controlled by the same operations whether the corner or the border is dragged.
- the first control technique is effective upon attaching importance to resizing by dragging a border, and is especially effective in case of the second embodiment. In consideration of only the case of dragging the border, the first control technique can achieve the desired resizing by a simpler operation than the second control technique.
- the second control technique is effective when both the corner and the border are likely to be dragged, and when importance is also attached to dragging the corner.
- the desired resizing can be achieved by an operation common to dragging the corner and dragging the border.
- This embodiment will explain display control of the present invention, which is applied to a case in which a window includes a plurality of sub-windows, and each sub-window is resized by dragging a boundary between the neighboring sub-windows.
- Some applications display their contents in a window defined by a single area, and other applications display them in a window that includes a plurality of sub-windows.
- FIG. 11 shows an example of the latter application.
- the display efficiency can be improved compared to a case of a single window, and a more comfortable user interface can be provided.
- the left or top part of the display contents in each sub-window is preferentially displayed in some cases. This is based on the same situation as a window defined by a single area, that is, the idea that the first character of a sentence and the first line of a page are to be preferentially displayed.
- this embodiment provides a display control method that makes it possible, when resizing sub-windows by dragging a boundary, to switch scrolling ON/OFF for the sub-windows on the two sides of the boundary concurrently and in real time during the resizing operation.
- Unlike in the related art, this obviates the need to fix scrolling ON/OFF in advance.
- Control mode 1 resizing with scrolling of both the sub-windows on the first and second sides
- Control mode 2 resizing with scrolling of the sub-window on the first side and that without scrolling of the sub-window on the second side
- Control mode 3 resizing without scrolling of the sub-window on the first side and that with scrolling of the sub-window on the second side
- Control mode 4 resizing without scrolling of both the sub-windows on the first and second sides
- the relationship between the sub-windows on the first and second sides can be considered as that between neighboring sub-windows on, for example, the left and right sides or the upper and lower sides of the boundary.
- FIG. 12 is a view for explaining this embodiment, taking as an example a window divided into left and right sub-windows as the first and second sub-windows. Note that the user can drag only one boundary per drag operation, and the same display control applies to a window divided into upper and lower sub-windows as to one divided into left and right sub-windows.
- a window 1200 is defined by borders 1201 , 1202 , 1203 , and 1204 , and has sub-windows 1207 , 1208 , and 1209 partitioned by boundaries 1205 and 1206 .
- Each of the boundaries 1205 and 1206 is divided into two regions.
- the upper half region is called a first region
- the lower half region is called a second region.
- the division method is merely an example, and is not limited to that shown in FIG. 12 .
- the same division method as that applied to each border in the first embodiment may be adopted.
- the position of a cursor P can be expressed by P(Px, Py) based on an X-Y coordinate system 502 set on the display screen on which the window 1200 is displayed.
- Let LBy be the length of the boundary 1205 within the window 1200
- BL(BLx, BLy) be the position of an intersection between the lower end of the boundary 1205 and the lower border 1202 .
- Condition 1: the condition required to locate the cursor P on the first region is described by: LBy/2 < Py − BLy ≤ LBy
- Condition 2: the condition required to locate the cursor P on the second region is described by: 0 ≤ Py − BLy ≤ LBy/2
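The two conditions can be checked directly. A minimal sketch, assuming the Y coordinate increases upward so that the first (upper-half) region corresponds to the larger offsets from the boundary's lower end BL; names are illustrative:

```python
def cursor_region(py, bly, lby):
    """Classify the cursor against conditions 1 and 2.

    py:  Y coordinate of the cursor P
    bly: Y coordinate of BL, the lower end of the boundary
    lby: length LBy of the boundary within the window
    Returns 'first', 'second', or None when P is off the boundary.
    """
    offset = py - bly
    if lby / 2 < offset <= lby:
        return 'first'     # condition 1: upper half of the boundary
    if 0 <= offset <= lby / 2:
        return 'second'    # condition 2: lower half of the boundary
    return None
```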
- QL(QLx, QLy) be the position of arbitrary display contents within the sub-window 1207 on the left side of the boundary 1205
- QR(QRx, QRy) be the position of arbitrary display contents within the sub-window 1208 on the right side.
- the boundary 1205 will be described below. However, the scroll control of the display contents upon resizing the sub-windows with reference to the boundary 1206 can be similarly executed.
- ΔQLx and ΔQRx are the differences of QLx and QRx before and after resizing of the sub-windows.
- ΔPx is the difference of Px before and after resizing of the sub-windows. Note that these differences correspond to the change amount of the boundary 1205 in the X direction.
- the four types of resizing control of the control modes 1 to 4 are switched by combining dragging of the cursor which is located on either the first or second region, and ON/OFF of the second button operation.
- the cursor located on the first region is dragged, and the second button is ON.
- the cursor located on the first region is dragged, and the second button is OFF.
- the cursor located on the second region is dragged, and the second button is ON.
- the cursor located on the second region is dragged, and the second button is OFF.
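The four combinations above can be captured in a small lookup table. A sketch under the assumption that the "first side" is the left sub-window and the "second side" is the right sub-window, as in FIG. 12 ; the names are illustrative, not from the patent:

```python
# Hypothetical mapping from (drag region, second-button state) to the
# control mode number and the (left-side, right-side) scroll flags.
CONTROL_MODES = {
    ('first', True):   (1, True,  True),   # scroll both sides
    ('first', False):  (2, True,  False),  # scroll the first side only
    ('second', True):  (3, False, True),   # scroll the second side only
    ('second', False): (4, False, False),  # scroll neither side
}

def select_control_mode(region, second_button_on):
    """Return (mode, scroll_left, scroll_right) for one drag state."""
    return CONTROL_MODES[(region, second_button_on)]
```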
- the display control method uses, in cooperation, control based on the cursor position in the Y direction, as in the first and second embodiments, and control based on the second button of the operation unit 109 , as in the third embodiment.
- switching between resizing with scrolling and that without scrolling for each of the sub-windows on the two sides is controlled concurrently during the single, continuous drag operation and cursor movement.
- the start and continuation of dragging are controlled by ON/OFF of the first button of the operation unit 109 as in the above embodiments.
- FIG. 13 shows an example of a change in display contents when the user moves a boundary on a window divided by the single boundary.
- reference numeral 1301 denotes a state before beginning of dragging.
- a left sub-window displays alphabetical letters “ABD”
- a right sub-window displays three rows of numerals “1” to “9”.
- a display state of a window 1303 is set. Since both the left and right sub-windows are scrolled, the display contents near the boundary remain unchanged, but those near the left and right borders of the window are changed.
- a display state of a window 1304 is set. At this time, only the right sub-window is scrolled. Hence, alphabetical letters “FG” hidden on the left sub-window are newly displayed near the boundary. On the other hand, on the right sub-window, numerals “1 2 3” near the right border of the window, which were displayed on the window 1303 , are hidden.
- a display state like a window 1305 is set. At this time, only the left sub-window is scrolled. Hence, on the left sub-window, alphabetical letters “AB” hidden near the left border of the window are displayed. On the other hand, since the right sub-window is not scrolled, numerals “3 4 5 6” are hidden by the boundary.
- FIG. 14 is a flowchart showing an example of the window resizing processing according to the fourth embodiment.
- the processing corresponding to the flowchart shown in FIG. 14 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control respective components.
- FIG. 14 describes a case in which the user resizes the sub-windows by dragging the boundary 1205 of the window 1200 .
- the embodiment of the invention is not limited to the case in which the boundary 1205 is dragged. That is, the same processing as in FIG. 14 can resize the sub-windows by dragging the boundary 1206 or another boundary.
- In step S 1401 , the CPU 101 acquires operation information (information of a first instruction operation) of a first button of a mouse 120 or digital pen 130 of the operation unit 109 , and information (moving information) of the moving direction and amount of the mouse 120 or digital pen 130 .
- the first button corresponds to a left button 121 of the mouse 120 if the mouse 120 is used in the default settings of Microsoft Windows®.
- the first button corresponds to a tip switch 131 at the pen tip of the digital pen 130 .
- the CPU 101 determines in step S 1402 based on the operation information of the first button acquired in step S 1401 whether or not the first button is switched from OFF to ON. If it is determined that the first button is switched to ON (“YES” in step S 1402 ), the process advances to step S 1403 . On the other hand, if it is determined that the first button is kept OFF without being switched to ON (“NO” in step S 1402 ), the process returns to step S 1401 to continue the processing.
- In step S 1403 , the CPU 101 calculates the position coordinate of the cursor P (cursor position coordinate) based on the moving amount information acquired in step S 1401 to determine on which boundary of the window 1200 the cursor is located. This determination can be made by checking which of the predetermined regions, set based on the boundaries included in the window 1200 , includes the cursor position coordinate.
- If it is determined in step S 1403 that the cursor is located on the boundary 1205 of the window 1200 (“boundary 1205 ” in step S 1403 ), it can be determined that the user begins to drag the boundary 1205 . In this case, the process advances to step S 1404 . On the other hand, if the cursor is located on another boundary (the boundary 1206 or the like) (“another” in step S 1403 ), it can be determined that the user begins to drag that boundary. In this case, the process advances to step S 1405 . In step S 1405 , the CPU 101 executes window resizing processing by dragging of the other boundary.
- In step S 1404 , the CPU 101 determines the position coordinates P(Px, Py), BL(BLx, BLy), QL(QLx, QLy), and QR(QRx, QRy) at the beginning of dragging, as shown in FIG. 12 , for the window which begins to be dragged. Note that the definitions of the respective coordinates are the same as those described above.
- In step S 1406 , the CPU 101 further acquires the information of the first instruction operation and moving amount information, and also operation information of the second button (information of a second instruction operation) of the mouse 120 or digital pen 130 of the operation unit 109 . Also, the CPU 101 updates the cursor position coordinate P(Px, Py) based on the moving amount information. The CPU 101 then determines in step S 1407 whether or not the first button is kept ON. If the first button is not kept ON but is switched to OFF (“NO” in step S 1407 ), this processing ends. In this case, a so-called “drop” operation is made.
- In step S 1408 , the CPU 101 sets the X component BLx of the end position BL of the boundary 1205 to match the X component Px of the cursor position P updated in step S 1406 . In this way, the position of the boundary 1205 follows the cursor movement.
- The CPU 101 then determines in step S 1409 , based on the Y coordinate Py of the cursor position obtained in step S 1406 , on which of the first and second regions the cursor P is located, and determines, based on the operation information of the second button, whether the second button is ON.
- If the cursor P is located on the first region and the second button is ON, the process advances to step S 1410 . If the cursor P is located on the first region and the second button is OFF, the process advances to step S 1411 . Furthermore, if the cursor P is located on the second region and the second button is ON, the process advances to step S 1412 . Finally, if the cursor P is located on the second region and the second button is OFF, the process advances to step S 1413 .
- In step S 1410 , the CPU 101 sets the moving amount ΔQLx, in the X direction, of the position QL of the arbitrary display contents on the left sub-window 1207 (the first side of the boundary 1205 ) to be equal to the moving amount ΔPx of the cursor P in the X direction. The CPU 101 likewise sets the moving amount ΔQRx, in the X direction, of the position QR of the arbitrary display contents on the right sub-window 1208 (the second side of the boundary 1205 ) to be equal to ΔPx. As a result, the display contents on both sub-windows are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction.
- In step S 1411 , the CPU 101 sets ΔQLx to be equal to ΔPx, and sets ΔQRx to zero. In this way, the display contents on the left sub-window 1207 are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction, while the display contents on the right sub-window 1208 are not scrolled.
- In step S 1412 , the CPU 101 sets ΔQLx to zero, and sets ΔQRx to be equal to ΔPx. In this way, the display contents on the left sub-window 1207 are not scrolled, while the display contents on the right sub-window 1208 are scrolled by a size corresponding to the change amount of the boundary 1205 in the X direction.
- In step S 1413 , the CPU 101 sets both ΔQLx and ΔQRx to zero. In this way, the display contents on the sub-windows 1207 and 1208 are not scrolled.
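Steps S 1408 to S 1413 reduce to a short update rule: the boundary follows the cursor's X position, the left sub-window scrolls exactly when the drag uses the first region, and the right sub-window scrolls exactly when the second button is held. A hedged Python sketch of one drag step; the state dictionary and function name are assumptions:

```python
def boundary_drag_step(state, px, region, second_button_on):
    """One drag step on the boundary 1205 (steps S1408-S1413, paraphrased).

    state: dict with 'blx' (boundary X position BLx) and 'qlx', 'qrx'
    (X positions QLx, QRx of contents in the left/right sub-windows).
    """
    dpx = px - state['blx']      # cursor movement = boundary movement
    state['blx'] = px            # S1408: the boundary follows the cursor
    if region == 'first':        # S1410/S1411: dQLx = dPx
        state['qlx'] += dpx
    if second_button_on:         # S1410/S1412: dQRx = dPx
        state['qrx'] += dpx
    # Otherwise the corresponding delta stays zero (S1413): no scrolling.
    return state
```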
- In step S 1414 , the CPU 101 updates the display of the cursor and window 1200 .
- The CPU 101 executes this updating process based on the position BLx of the boundary 1205 determined in step S 1408 , and the moving amounts ΔQLx and ΔQRx determined in one of steps S 1410 to S 1413 . After that, the process returns to step S 1406 to continue the processing.
- Returning to step S 1406 represents cursor movement during dragging; that is, dragging continues and resizing of the window is in progress during this loop.
- On the other hand, if the first button is switched OFF, this represents that the drop operation has been made to settle the window size.
- The display control method according to this embodiment can be applied not only to the window configuration shown in FIGS. 12 and 13 but also to a window divided into upper and lower sub-windows. Furthermore, the method of this embodiment can be applied to a window divided into upper, lower, left, and right sub-windows, as shown in FIG. 11 .
- The window shown in FIG. 11 is normally configured so that the boundary dividing the upper and lower sub-windows and the boundary dividing the left and right sub-windows are independently operable. Hence, by executing the same processing as that shown in FIG. 14 in turn on these boundaries, the display control method of this embodiment can be applied.
- the first and second regions are required to be defined on each boundary.
- the length of each boundary may be equally divided.
- a part divided by an intersection of the vertical and horizontal boundaries may be equally divided.
- the lengths of the first and second regions change sequentially depending on the position of the intersection.
- the change method of the display contents in the sub-windows can be controlled simultaneously with resizing of the sub-windows. In this way, a desired display result can be obtained by a series of operations, thus improving the work efficiency.
- This embodiment proposes display control which is executed in association with the scrolling ON/OFF control method upon resizing a window, as proposed by the present invention.
- display control executed upon resizing includes control for switching ON/OFF of scrolling or a scroll ratio of the display contents according to dragging of a border or corner, control for reducing or enlarging the display contents according to dragging of a border or corner, or the like.
- When the display contents are scrolled upon resizing, the contents in the area opposite to the dragged part are hidden.
- When the display contents are not scrolled upon resizing, the contents in the area near the dragged part are hidden.
- the “area opposite to the dragged part” is an area near a border opposite to the dragged border, or an area near two borders that do not contact the dragged corner.
- the “area near the dragged part” is an area near the dragged border or an area near two borders that contact the dragged corner.
- object images such as characters, patterns, photos, and the like, which are located in an area that would normally be hidden, are displayed packed together at the edge of that area, so that the user can still see them.
- The display contents shown in FIG. 16 are assumed. This may be the normal window described in the first embodiment, or the window described in the second embodiment, which is always maximized in one direction (the Y direction) within the display screen.
- a left border 1601 is movable by dragging, and a window 1600 can be resized by moving this border 1601 .
- FIGS. 17A and 17B show display examples when the user resizes (reduces) the window by dragging the border in this embodiment.
- FIG. 17A shows a display example upon resizing with scrolling. With this display control, the respective objects move to the right upon resizing, and their movement stops when they come into contact with the opposing border. In this case, the objects are displayed overlapping each other near the opposing border.
- FIG. 17B shows a display example upon resizing without scrolling.
- With this display control, since no scrolling is performed, all objects initially remain at their positions when dragging of the border begins. However, when the dragged border moves to the right and comes into contact with an object, that object begins to move to the right. In this case, the objects are displayed overlapping each other near the dragged border. As for the overlapping order, a newly stopped object may be displayed in front of or behind a preexisting object.
- display control can be implemented as if objects attached to the window were being swept up by a wiper; objects that would normally be hidden remain visible, albeit imperfectly, thus improving usability.
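The wiper behaviour of the no-scrolling case can be modelled by clamping each object's position against the advancing border. A minimal sketch, assuming point-like objects and a left border dragged to the right; overlap handling and drawing order are omitted:

```python
def sweep_objects(border_x, object_xs):
    """FIG. 17B sketch: objects stand still until the dragged left border
    reaches them, then they are pushed along with it (and may end up
    overlapping near the border).

    border_x:  current X position of the dragged left border
    object_xs: X positions of the objects in the window
    """
    return [max(x, border_x) for x in object_xs]
```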
- FIGS. 18A and 18B show display examples upon resizing a window by dragging one corner of the window.
- FIG. 18A shows a display example upon resizing with scrolling.
- FIG. 18B shows a display example upon resizing without scrolling.
- the respective operations are the same as those described with reference to FIGS. 17A and 17B, applied to the X and Y components.
- FIG. 19A shows a display example upon resizing with scrolling.
- the following display control is executed. That is, respective objects move to the right upon resizing, and their movement stops when they are brought into contact with the opposing border.
- when a moving object is brought into contact with an object that has already stopped, the movement of that object stops at that time.
- as a result, objects are displayed so as not to overlap each other, unlike in FIG. 17A.
- FIG. 19B shows a display example upon resizing without scrolling.
- the following display control is executed. That is, all objects stand still initially. When the dragged border moves to the right and is brought into contact with respective objects, these objects begin to move to the right. In addition, when the objects which have already begun to move are brought into contact with other objects, the other objects begin to move at that time. As a result, objects are displayed so as not to overlap each other, unlike in FIG. 17B.
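The no-scroll, no-overlap behavior of FIG. 19B, in which objects pushed by the dragged border push the objects ahead of them in turn, can be sketched in one dimension. This is an illustrative model only, not the patented implementation; the list-of-intervals representation is an assumption:

```python
def push_objects(objects, border_x):
    """Push non-overlapping [left, right] X-extents ahead of a left
    border dragged rightward to X = border_x (FIG. 19B behavior).

    An object touched by the border moves right just enough to clear it;
    a moving object that reaches the next object pushes that one too,
    so no two objects ever overlap.
    """
    objects = sorted(objects, key=lambda o: o[0])
    pushed = []
    frontier = border_x  # leftmost X an object may now occupy
    for left, right in objects:
        width = right - left
        if left < frontier:          # contacted by border or a pushed object
            left, right = frontier, frontier + width
        frontier = right             # this object may push the next one
        pushed.append([left, right])
    return pushed
```

For example, with objects at [0, 10], [12, 22], and [30, 40] and the border dragged to X = 15, the first two are pushed to [15, 25] and [25, 35], and the second then pushes the third to [35, 45].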
- FIG. 20 is a flowchart showing an example of the window resizing processing corresponding to the display examples shown in FIGS. 17A and 17B .
- the processing corresponding to the flowchart shown in FIG. 20 is implemented when a CPU 101 reads out a corresponding processing program stored in an HD 103 onto a RAM 102 and executes that program to control respective components.
- the CPU 101 determines in step S2001 whether or not the user begins to drag a border. If the user begins to drag the border (“YES” in step S2001), the process advances to step S2002.
- the CPU 101 determines in step S2002 if scrolling is ON simultaneously with resizing of a window by dragging. If it is determined that scrolling is OFF (“NO” in step S2002), the process advances to step S2003; otherwise (“YES” in step S2002), the process advances to step S2005. Note that ON/OFF of scrolling can be determined according to the processes described in the first to fourth embodiments.
- a display area of an object O is expressed by O ⁇ (O1x, O1y), (O2x, O2y) ⁇ .
- (O1x, O1y) represents the coordinates of the upper left end of the object
- (O2x, O2y) represents the coordinates of the lower right end of the object.
- the left direction corresponds to a negative direction of the X-axis on an X-Y coordinate system 502 set in association with the display screen
- the up direction corresponds to a positive direction of the Y-axis.
- the right direction corresponds to a positive direction of the X-axis
- the down direction corresponds to a negative direction of the Y-axis.
- let ΔO(ΔO1x, ΔO2x) be a change in the display area O in the X-axis direction.
- the CPU 101 determines in step S2003 whether or not there is an object which is in contact with the dragged border. This determination can be made by comparing the coordinates of the display position of the object with those of the dragged border: when the X-coordinate Bx of the dragged border falls within the range O1x ≦ Bx ≦ O2x, the object can be considered to be in contact with the dragged border. Note that since the flowchart of FIG. 20 assumes the case of FIGS. 17A and 17B, that is, the case of dragging the border in the X direction, only the coordinate in the X-axis direction is considered. When a border also moves in the Y direction, whether or not an object is in contact with the dragged border can be determined by checking whether or not the position By of the border in the Y direction falls within the Y-extent of that object.
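The contact test of step S2003 follows directly from the O{(O1x, O1y), (O2x, O2y)} representation above. A minimal sketch (the tuple-of-corners encoding is an assumption for illustration):

```python
def in_contact(obj, bx, by=None):
    """Return True if a border at X = bx touches the object.

    obj is ((o1x, o1y), (o2x, o2y)): upper-left and lower-right corners,
    as in the O{(O1x, O1y), (O2x, O2y)} notation. The test is
    O1x <= Bx <= O2x; if by is given (a border that also moves in the
    Y direction), By must additionally fall within the object's Y-extent.
    The Y-axis points up, so the upper-left corner has the larger Y.
    """
    (o1x, o1y), (o2x, o2y) = obj
    if not (o1x <= bx <= o2x):
        return False
    return by is None or (o2y <= by <= o1y)
```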
- if it is determined in step S2003 that there is an object that is in contact with the dragged border (“YES” in step S2003), the process advances to step S2004. On the other hand, if it is determined that there is no object that is in contact with the dragged border (“NO” in step S2003), the process jumps to step S2007.
- if scrolling is executed simultaneously with dragging of the border, the CPU 101 determines in step S2005 whether or not there is an object that is in contact with the border opposite to the dragged border.
- if it is determined in step S2005 that there is an object that contacts the opposing border (“YES” in step S2005), the process advances to step S2006. On the other hand, if it is determined that there is no object that contacts the opposing border (“NO” in step S2005), the process jumps to step S2007.
- in step S2007, the CPU 101 updates the display of the object which is in contact with the border, based on the moving amount of the object determined in step S2004 or S2006.
- the CPU 101 updates the display of the other objects according to ON/OFF of scrolling, based on the determination result in step S2002.
- the CPU 101 determines in step S2008 whether or not the user has ended dragging. If it is determined that the user has ended dragging (“YES” in step S2008), this processing ends. On the other hand, if it is determined that the user has not ended dragging (“NO” in step S2008), the process returns to step S2002 to continue the processing.
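The loop of FIG. 20 can be summarized as follows for a left border dragged to the right. This is a structural sketch only; the two update rules stand in for steps S2003 through S2007, and the list-of-extents representation is an assumption:

```python
def resize_drag_loop(border_path, scroll_on, objects, opposing_x):
    """Iterate over successive dragged-border positions (until the drag
    ends -- the S2008 check) and update object X-extents each step.

    With scrolling (FIG. 17A): every object moves with the content and
    stops on reaching the opposing border, overlapping there.
    Without scrolling (FIG. 17B): an object stands still until the
    dragged border reaches it, then rides along, overlapping near it.
    """
    prev = border_path[0]
    for bx in border_path:                 # loop body = steps S2002..S2007
        delta, prev = bx - prev, bx
        for o in objects:
            width = o[1] - o[0]
            if scroll_on:                  # S2005/S2006 branch
                o[1] = min(o[1] + delta, opposing_x)
                o[0] = o[1] - width
            elif o[0] < bx:                # S2003/S2004 branch
                o[0], o[1] = bx, bx + width
        # S2007: the display would be redrawn here on each iteration.
    return objects
```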
- by extending the aforementioned processing also in the Y direction, the display control corresponding to FIGS. 18A and 18B can be implemented.
- the display control of objects within a display area can be implemented based on the presence/absence of a contact with the boundary or border in the same manner as described above.
- the above-described exemplary embodiments of the present invention can also be achieved by providing a computer-readable storage medium that stores program code of software (computer program) which realizes the operations of the above-described exemplary embodiments, to a system or an apparatus. Further, the above-described exemplary embodiments can be achieved by program code (computer program) stored in a storage medium read and executed by a computer (CPU or micro-processing unit (MPU)) of a system or an apparatus.
- the computer program realizes each step included in the flowcharts of the above-mentioned exemplary embodiments.
- the computer program is a program for causing a computer to function as the processing unit corresponding to each step included in the flowcharts.
- the computer program itself read from a computer-readable storage medium realizes the operations of the above-described exemplary embodiments, and the storage medium storing the computer program constitutes the present invention.
- the CPU executes each step in the flowcharts in cooperation with a memory, a hard disk, a display device, and so on.
- the present invention is not limited to the above configuration, and a dedicated electronic circuit can perform a part or the whole of processes in each step described in each flowchart in place of the CPU.
Abstract
Description
Cx=Px (1)
Ly/3≦Py−Cy≦2Ly/3
0<Py−Cy<Ly/3 or 2Ly/3<Py−Cy<Ly
ΔQx=0 (2)
where ΔQx is a difference between Qx at the beginning of the resizing without scrolling, and Qx after the window is resized.
ΔQx=ΔCx=ΔPx (3)
where ΔQx is a difference between Qx at the beginning of the resizing with scrolling, and Qx after the window size is resized. Likewise, ΔCx and ΔPx are differences between Cx and Px at the beginning of the resizing with scrolling, and Cx and Px after the window is resized. Note that these differences correspond to change amounts of the
ΔC(ΔCx,ΔCy)=ΔP(ΔPx,ΔPy) (4)
ΔB(ΔBx,ΔBy)=ΔP(ΔPx,ΔPy) (5)
ΔQ(ΔQx,ΔQy)=ΔP(ΔPx,ΔPy) (6)
where Δ indicates a change amount.
ΔC(ΔCx,ΔCy)=ΔP(ΔPx,ΔPy) (7)
ΔB(ΔBx,ΔBy)=(0,0) (8)
ΔQ(ΔQx,ΔQy)=(0,0) (9)
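Equations (4) to (6) and (7) to (9) contrast the two modes: with scrolling, C, B, and Q all track the drag displacement ΔP, while without scrolling only C does and B and Q stay fixed. A minimal encoding of just that rule (what C, B, and Q denote is defined earlier in the patent; this snippet only restates the equations):

```python
def change_amounts(dp, scroll_on):
    """Return (dC, dB, dQ) for a drag displacement dp = (dPx, dPy).

    Scrolling on:  dC = dB = dQ = dP         -- equations (4)-(6).
    Scrolling off: dC = dP, dB = dQ = (0, 0) -- equations (7)-(9).
    """
    if scroll_on:
        return dp, dp, dp
    return dp, (0, 0), (0, 0)
```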
LBy/2≦Py−BLy≦LBy
0<Py−BLy<LBy/2
ΔQLx=ΔQRx=0 (10)
where ΔQLx and ΔQRx are differences of QLx and QRx before and after resizing of the sub-windows.
ΔQLx=ΔQRx=ΔPx (11)
Claims (15)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/345,230 US9389746B2 (en) | 2007-07-17 | 2012-01-06 | Information processing apparatus and control method thereof, and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007-186326 | 2007-07-17 | ||
JP2007186326A JP5184832B2 (en) | 2007-07-17 | 2007-07-17 | Information processing apparatus, control method therefor, and computer program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/345,230 Continuation US9389746B2 (en) | 2007-07-17 | 2012-01-06 | Information processing apparatus and control method thereof, and computer program |
Publications (2)
Publication Number | Publication Date |
---|---|
US20090024956A1 US20090024956A1 (en) | 2009-01-22 |
US8112716B2 true US8112716B2 (en) | 2012-02-07 |
Family
ID=40265879
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/170,994 Expired - Fee Related US8112716B2 (en) | 2007-07-17 | 2008-07-10 | Information processing apparatus and control method thereof, and computer program |
US13/345,230 Expired - Fee Related US9389746B2 (en) | 2007-07-17 | 2012-01-06 | Information processing apparatus and control method thereof, and computer program |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/345,230 Expired - Fee Related US9389746B2 (en) | 2007-07-17 | 2012-01-06 | Information processing apparatus and control method thereof, and computer program |
Country Status (2)
Country | Link |
---|---|
US (2) | US8112716B2 (en) |
JP (1) | JP5184832B2 (en) |
Cited By (136)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100302130A1 (en) * | 2009-05-29 | 2010-12-02 | Seiko Epson Corporation | Image display system, image display device, and image display method |
US8775972B2 (en) * | 2012-11-08 | 2014-07-08 | Snapchat, Inc. | Apparatus and method for single action control of social network profile access |
US9083770B1 (en) | 2013-11-26 | 2015-07-14 | Snapchat, Inc. | Method and system for integrating real time communication features in applications |
US9094137B1 (en) | 2014-06-13 | 2015-07-28 | Snapchat, Inc. | Priority based placement of messages in a geo-location based event gallery |
US9225897B1 (en) | 2014-07-07 | 2015-12-29 | Snapchat, Inc. | Apparatus and method for supplying content aware photo filters |
US9237202B1 (en) | 2014-03-07 | 2016-01-12 | Snapchat, Inc. | Content delivery network for ephemeral objects |
US9276886B1 (en) | 2014-05-09 | 2016-03-01 | Snapchat, Inc. | Apparatus and method for dynamically configuring application component tiles |
US9385983B1 (en) | 2014-12-19 | 2016-07-05 | Snapchat, Inc. | Gallery of messages from individuals with a shared interest |
US9396354B1 (en) | 2014-05-28 | 2016-07-19 | Snapchat, Inc. | Apparatus and method for automated privacy protection in distributed images |
US9537811B2 (en) | 2014-10-02 | 2017-01-03 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US9705831B2 (en) | 2013-05-30 | 2017-07-11 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9721394B2 (en) | 2012-08-22 | 2017-08-01 | Snaps Media, Inc. | Augmented reality virtual content platform apparatuses, methods and systems |
US9742713B2 (en) | 2013-05-30 | 2017-08-22 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9843720B1 (en) | 2014-11-12 | 2017-12-12 | Snap Inc. | User interface for accessing media at a geographic location |
US9854219B2 (en) | 2014-12-19 | 2017-12-26 | Snap Inc. | Gallery of videos set to an audio time line |
US9866999B1 (en) | 2014-01-12 | 2018-01-09 | Investment Asset Holdings Llc | Location-based messaging |
US9936030B2 (en) | 2014-01-03 | 2018-04-03 | Investel Capital Corporation | User content sharing system and method with location-based external content integration |
US10055717B1 (en) | 2014-08-22 | 2018-08-21 | Snap Inc. | Message processor with application prompts |
US10084735B1 (en) | 2014-02-21 | 2018-09-25 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10123166B2 (en) | 2015-01-26 | 2018-11-06 | Snap Inc. | Content request by location |
US10135949B1 (en) | 2015-05-05 | 2018-11-20 | Snap Inc. | Systems and methods for story and sub-story navigation |
US10133705B1 (en) | 2015-01-19 | 2018-11-20 | Snap Inc. | Multichannel system |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US10165402B1 (en) | 2016-06-28 | 2018-12-25 | Snap Inc. | System to track engagement of media items |
US10203855B2 (en) | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
US10219111B1 (en) | 2018-04-18 | 2019-02-26 | Snap Inc. | Visitation tracking system |
US10223397B1 (en) | 2015-03-13 | 2019-03-05 | Snap Inc. | Social graph based co-location of network users |
US10284508B1 (en) | 2014-10-02 | 2019-05-07 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US10311916B2 (en) | 2014-12-19 | 2019-06-04 | Snap Inc. | Gallery of videos set to an audio time line |
US10319149B1 (en) | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US10327096B1 (en) | 2018-03-06 | 2019-06-18 | Snap Inc. | Geo-fence selection system |
US10334307B2 (en) | 2011-07-12 | 2019-06-25 | Snap Inc. | Methods and systems of providing visual content editing functions |
US10339833B2 (en) * | 2011-09-12 | 2019-07-02 | Microsoft Technology Licensing, Llc | Assistive reading interface |
US10348662B2 (en) | 2016-07-19 | 2019-07-09 | Snap Inc. | Generating customized electronic messaging graphics |
US10354425B2 (en) | 2015-12-18 | 2019-07-16 | Snap Inc. | Method and system for providing context relevant media augmentation |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US10387514B1 (en) | 2016-06-30 | 2019-08-20 | Snap Inc. | Automated content curation and communication |
US10423983B2 (en) | 2014-09-16 | 2019-09-24 | Snap Inc. | Determining targeting information based on a predictive targeting model |
US10430838B1 (en) | 2016-06-28 | 2019-10-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections with automated advertising |
US10439972B1 (en) | 2013-05-30 | 2019-10-08 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US10474321B2 (en) | 2015-11-30 | 2019-11-12 | Snap Inc. | Network resource location linking and visual content sharing |
US10499191B1 (en) | 2017-10-09 | 2019-12-03 | Snap Inc. | Context sensitive presentation of content |
US10523625B1 (en) | 2017-03-09 | 2019-12-31 | Snap Inc. | Restricted group content collection |
US10581782B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US10582277B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US10592574B2 (en) | 2015-05-05 | 2020-03-17 | Snap Inc. | Systems and methods for automated local story generation and curation |
US10616239B2 (en) | 2015-03-18 | 2020-04-07 | Snap Inc. | Geo-fence authorization provisioning |
US10623666B2 (en) | 2016-11-07 | 2020-04-14 | Snap Inc. | Selective identification and order of image modifiers |
US10679393B2 (en) | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
US10678818B2 (en) | 2018-01-03 | 2020-06-09 | Snap Inc. | Tag distribution visualization system |
US10679389B2 (en) | 2016-02-26 | 2020-06-09 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US10817898B2 (en) | 2015-08-13 | 2020-10-27 | Placed, Llc | Determining exposures to content presented by physical objects |
US10824654B2 (en) | 2014-09-18 | 2020-11-03 | Snap Inc. | Geolocation-based pictographs |
US10834525B2 (en) | 2016-02-26 | 2020-11-10 | Snap Inc. | Generation, curation, and presentation of media collections |
US10862951B1 (en) | 2007-01-05 | 2020-12-08 | Snap Inc. | Real-time display of multiple images |
US10885136B1 (en) | 2018-02-28 | 2021-01-05 | Snap Inc. | Audience filtering system |
US10915911B2 (en) | 2017-02-03 | 2021-02-09 | Snap Inc. | System to determine a price-schedule to distribute media content |
US10933311B2 (en) | 2018-03-14 | 2021-03-02 | Snap Inc. | Generating collectible items based on location information |
US10948717B1 (en) | 2015-03-23 | 2021-03-16 | Snap Inc. | Reducing boot time and power consumption in wearable display systems |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US10993069B2 (en) | 2015-07-16 | 2021-04-27 | Snap Inc. | Dynamically adaptive media content delivery |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US11017173B1 (en) | 2017-12-22 | 2021-05-25 | Snap Inc. | Named entity recognition visual context and caption data |
US11023514B2 (en) | 2016-02-26 | 2021-06-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11030787B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Mobile-based cartographic control of display content |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11163941B1 (en) | 2018-03-30 | 2021-11-02 | Snap Inc. | Annotating a collection of media content items |
US11170393B1 (en) | 2017-04-11 | 2021-11-09 | Snap Inc. | System to calculate an engagement score of location based media content |
US11182383B1 (en) | 2012-02-24 | 2021-11-23 | Placed, Llc | System and method for data collection to validate location data |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US11206615B2 (en) | 2019-05-30 | 2021-12-21 | Snap Inc. | Wearable device location systems |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11216869B2 (en) | 2014-09-23 | 2022-01-04 | Snap Inc. | User interface to augment an image using geolocation |
US11228551B1 (en) | 2020-02-12 | 2022-01-18 | Snap Inc. | Multiple gateway message exchange |
US11232040B1 (en) | 2017-04-28 | 2022-01-25 | Snap Inc. | Precaching unlockable data elements |
US11250075B1 (en) | 2017-02-17 | 2022-02-15 | Snap Inc. | Searching social media content |
US11249614B2 (en) | 2019-03-28 | 2022-02-15 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11265273B1 (en) | 2017-12-01 | 2022-03-01 | Snap, Inc. | Dynamic media overlay with smart widget |
US11290851B2 (en) | 2020-06-15 | 2022-03-29 | Snap Inc. | Location sharing using offline and online objects |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11314776B2 (en) | 2020-06-15 | 2022-04-26 | Snap Inc. | Location sharing using friend list versions |
US11343323B2 (en) | 2019-12-31 | 2022-05-24 | Snap Inc. | Augmented reality objects registry |
US11361493B2 (en) | 2019-04-01 | 2022-06-14 | Snap Inc. | Semantic texture mapping system |
US11388226B1 (en) | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
US11429618B2 (en) | 2019-12-30 | 2022-08-30 | Snap Inc. | Surfacing augmented reality objects |
US11430091B2 (en) | 2020-03-27 | 2022-08-30 | Snap Inc. | Location mapping for large scale augmented-reality |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11475254B1 (en) | 2017-09-08 | 2022-10-18 | Snap Inc. | Multimodal entity identification |
US11483267B2 (en) | 2020-06-15 | 2022-10-25 | Snap Inc. | Location sharing using different rate-limited links |
US11503432B2 (en) | 2020-06-15 | 2022-11-15 | Snap Inc. | Scalable real-time location sharing framework |
US11500525B2 (en) | 2019-02-25 | 2022-11-15 | Snap Inc. | Custom media overlay system |
US20220365632A1 (en) * | 2021-05-17 | 2022-11-17 | Apple Inc. | Interacting with notes user interfaces |
US11507614B1 (en) | 2018-02-13 | 2022-11-22 | Snap Inc. | Icon based tagging |
US11516167B2 (en) | 2020-03-05 | 2022-11-29 | Snap Inc. | Storing data based on device location |
US11558709B2 (en) | 2018-11-30 | 2023-01-17 | Snap Inc. | Position service to determine relative position to map features |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US11601888B2 (en) | 2021-03-29 | 2023-03-07 | Snap Inc. | Determining location using multi-source geolocation data |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11606755B2 (en) | 2019-05-30 | 2023-03-14 | Snap Inc. | Wearable device location systems architecture |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11625443B2 (en) | 2014-06-05 | 2023-04-11 | Snap Inc. | Web document enhancement |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11645324B2 (en) | 2021-03-31 | 2023-05-09 | Snap Inc. | Location-based timeline media content system |
US11676378B2 (en) | 2020-06-29 | 2023-06-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US11675831B2 (en) | 2017-05-31 | 2023-06-13 | Snap Inc. | Geolocation based playlists |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11729343B2 (en) | 2019-12-30 | 2023-08-15 | Snap Inc. | Including video feed in message thread |
US11734712B2 (en) | 2012-02-24 | 2023-08-22 | Foursquare Labs, Inc. | Attributing in-store visits to media consumption based on data collected from user devices |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US11776256B2 (en) | 2020-03-27 | 2023-10-03 | Snap Inc. | Shared augmented reality system |
US11799811B2 (en) | 2018-10-31 | 2023-10-24 | Snap Inc. | Messaging and gaming applications communication platform |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11816853B2 (en) | 2016-08-30 | 2023-11-14 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US11821742B2 (en) | 2019-09-26 | 2023-11-21 | Snap Inc. | Travel based notifications |
US11829834B2 (en) | 2021-10-29 | 2023-11-28 | Snap Inc. | Extended QR code |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11860888B2 (en) | 2018-05-22 | 2024-01-02 | Snap Inc. | Event detection system |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11900418B2 (en) | 2016-04-04 | 2024-02-13 | Snap Inc. | Mutable geo-fencing system |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US11943192B2 (en) | 2020-08-31 | 2024-03-26 | Snap Inc. | Co-location connection service |
US11956533B2 (en) | 2021-11-29 | 2024-04-09 | Snap Inc. | Accessing media at a geographic location |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5279646B2 (en) * | 2008-09-03 | 2013-09-04 | キヤノン株式会社 | Information processing apparatus, operation method thereof, and program |
CN101672648A (en) * | 2008-09-12 | 2010-03-17 | 富士通天株式会社 | Information processing device and image processing device |
US9401099B2 (en) * | 2010-05-11 | 2016-07-26 | AI Squared | Dedicated on-screen closed caption display |
US10417018B2 (en) | 2011-05-27 | 2019-09-17 | Microsoft Technology Licensing, Llc | Navigation of immersive and desktop shells |
US9843665B2 (en) | 2011-05-27 | 2017-12-12 | Microsoft Technology Licensing, Llc | Display of immersive and desktop shells |
US8924885B2 (en) * | 2011-05-27 | 2014-12-30 | Microsoft Corporation | Desktop as immersive application |
JP5360140B2 (en) * | 2011-06-17 | 2013-12-04 | コニカミノルタ株式会社 | Information browsing apparatus, control program, and control method |
JP5604386B2 (en) * | 2011-07-29 | 2014-10-08 | 楽天株式会社 | Information processing apparatus, information processing apparatus control method, program, and information recording medium |
US9218123B2 (en) * | 2011-12-29 | 2015-12-22 | Apple Inc. | Device, method, and graphical user interface for resizing content viewing and text entry interfaces |
EP3182261A1 (en) | 2012-09-17 | 2017-06-21 | Huawei Device Co., Ltd. | Touch operation processing method and terminal device |
US9612713B2 (en) * | 2012-09-26 | 2017-04-04 | Google Inc. | Intelligent window management |
KR102089951B1 (en) * | 2013-03-14 | 2020-04-14 | 엘지전자 주식회사 | Mobile terminal and control method for the mobile terminal |
JP6188490B2 (en) * | 2013-08-28 | 2017-08-30 | キヤノン株式会社 | Image display apparatus, control method, and computer program |
KR20150037014A (en) * | 2013-09-30 | 2015-04-08 | 삼성전자주식회사 | Electronic device and method for providing user interface in electronic device |
KR102252321B1 (en) * | 2014-12-24 | 2021-05-14 | 삼성전자주식회사 | A display apparatus and a display method |
JP6647103B2 (en) * | 2016-03-23 | 2020-02-14 | キヤノン株式会社 | Display control device and control method thereof |
US10805696B1 (en) | 2016-06-20 | 2020-10-13 | Pipbin, Inc. | System for recording and targeting tagged content of user interest |
US11785161B1 (en) | 2016-06-20 | 2023-10-10 | Pipbin, Inc. | System for user accessibility of tagged curated augmented reality content |
US10638256B1 (en) | 2016-06-20 | 2020-04-28 | Pipbin, Inc. | System for distribution and display of mobile targeted augmented reality content |
US11876941B1 (en) | 2016-06-20 | 2024-01-16 | Pipbin, Inc. | Clickable augmented reality content manager, system, and network |
US10334134B1 (en) | 2016-06-20 | 2019-06-25 | Maximillian John Suiter | Augmented real estate with location and chattel tagging system and apparatus for virtual diary, scrapbooking, game play, messaging, canvasing, advertising and social interaction |
US11201981B1 (en) | 2016-06-20 | 2021-12-14 | Pipbin, Inc. | System for notification of user accessibility of curated location-dependent content in an augmented estate |
US11044393B1 (en) | 2016-06-20 | 2021-06-22 | Pipbin, Inc. | System for curation and display of location-dependent augmented reality content in an augmented estate system |
KR102375950B1 (en) * | 2016-12-02 | 2022-03-17 | 삼성전자주식회사 | Method for adjusting size of screen and electronic device thereof |
JP6773977B2 (en) * | 2017-03-01 | 2020-10-21 | 富士通クライアントコンピューティング株式会社 | Terminal device and operation control program |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09185480A (en) | 1995-12-28 | 1997-07-15 | Fuji Xerox Co Ltd | Multiwindow display device |
JP2765615B2 (en) | 1994-08-09 | 1998-06-18 | カシオ計算機株式会社 | Window display control device |
US5815151A (en) | 1996-03-08 | 1998-09-29 | International Business Machines Corp. | Graphical user interface |
JP3586351B2 (en) | 1997-03-21 | 2004-11-10 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Window display device and method, and recording medium recording window display control program |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03214362A (en) * | 1990-01-19 | 1991-09-19 | Fuji Xerox Co Ltd | Computer system |
JP3350570B2 (en) * | 1993-05-10 | 2002-11-25 | 富士通株式会社 | List display method |
JPH0793123A (en) * | 1993-09-20 | 1995-04-07 | Fujitsu Ltd | Display device |
JP3445341B2 (en) * | 1993-12-24 | 2003-09-08 | 株式会社東芝 | Window display device and window display method |
JP3404931B2 (en) * | 1994-11-15 | 2003-05-12 | カシオ計算機株式会社 | Table processing equipment |
CA2175148C (en) * | 1996-04-26 | 2002-06-11 | Robert Cecco | User interface control for creating split panes in a single window |
JP4281120B2 (en) * | 1998-01-16 | 2009-06-17 | ソニー株式会社 | Editing apparatus and method, and recording medium |
- 2007-07-17: JP JP2007186326A patent/JP5184832B2/en not_active Expired - Fee Related
- 2008-07-10: US US12/170,994 patent/US8112716B2/en not_active Expired - Fee Related
- 2012-01-06: US US13/345,230 patent/US9389746B2/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2765615B2 (en) | 1994-08-09 | 1998-06-18 | カシオ計算機株式会社 | Window display control device |
JPH09185480A (en) | 1995-12-28 | 1997-07-15 | Fuji Xerox Co Ltd | Multiwindow display device |
US5815151A (en) | 1996-03-08 | 1998-09-29 | International Business Machines Corp. | Graphical user interface |
JP3431795B2 (en) | 1996-03-08 | 2003-07-28 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Window resizing method and computer system |
JP3586351B2 (en) | 1997-03-21 | 2004-11-10 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Window display device and method, and recording medium recording window display control program |
US7051289B1 (en) | 1997-03-21 | 2006-05-23 | International Business Machines Corporation | Window display device and method, and a recording medium recording a window display control program |
Cited By (323)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11588770B2 (en) | 2007-01-05 | 2023-02-21 | Snap Inc. | Real-time display of multiple images |
US10862951B1 (en) | 2007-01-05 | 2020-12-08 | Snap Inc. | Real-time display of multiple images |
US8791877B2 (en) | 2009-05-29 | 2014-07-29 | Seiko Epson Corporation | Image display system, image display device, and image display method |
US20100302130A1 (en) * | 2009-05-29 | 2010-12-02 | Seiko Epson Corporation | Image display system, image display device, and image display method |
US11451856B2 (en) | 2011-07-12 | 2022-09-20 | Snap Inc. | Providing visual content editing functions |
US10999623B2 (en) | 2011-07-12 | 2021-05-04 | Snap Inc. | Providing visual content editing functions |
US11750875B2 (en) | 2011-07-12 | 2023-09-05 | Snap Inc. | Providing visual content editing functions |
US10334307B2 (en) | 2011-07-12 | 2019-06-25 | Snap Inc. | Methods and systems of providing visual content editing functions |
US10339833B2 (en) * | 2011-09-12 | 2019-07-02 | Microsoft Technology Licensing, Llc | Assistive reading interface |
US11182383B1 (en) | 2012-02-24 | 2021-11-23 | Placed, Llc | System and method for data collection to validate location data |
US11734712B2 (en) | 2012-02-24 | 2023-08-22 | Foursquare Labs, Inc. | Attributing in-store visits to media consumption based on data collected from user devices |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US9721394B2 (en) | 2012-08-22 | 2017-08-01 | Snaps Media, Inc. | Augmented reality virtual content platform apparatuses, methods and systems |
US9792733B2 (en) | 2012-08-22 | 2017-10-17 | Snaps Media, Inc. | Augmented reality virtual content platform apparatuses, methods and systems |
US10169924B2 (en) | 2012-08-22 | 2019-01-01 | Snaps Media Inc. | Augmented reality virtual content platform apparatuses, methods and systems |
US10887308B1 (en) | 2012-11-08 | 2021-01-05 | Snap Inc. | Interactive user-interface to adjust access privileges |
US11252158B2 (en) | 2012-11-08 | 2022-02-15 | Snap Inc. | Interactive user-interface to adjust access privileges |
US8775972B2 (en) * | 2012-11-08 | 2014-07-08 | Snapchat, Inc. | Apparatus and method for single action control of social network profile access |
US9882907B1 (en) | 2012-11-08 | 2018-01-30 | Snap Inc. | Apparatus and method for single action control of social network profile access |
US11115361B2 (en) | 2013-05-30 | 2021-09-07 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US10439972B1 (en) | 2013-05-30 | 2019-10-08 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US11509618B2 (en) | 2013-05-30 | 2022-11-22 | Snap Inc. | Maintaining a message thread with opt-in permanence for entries |
US9742713B2 (en) | 2013-05-30 | 2017-08-22 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US10587552B1 (en) | 2013-05-30 | 2020-03-10 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US11134046B2 (en) | 2013-05-30 | 2021-09-28 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9705831B2 (en) | 2013-05-30 | 2017-07-11 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9794303B1 (en) | 2013-11-26 | 2017-10-17 | Snap Inc. | Method and system for integrating real time communication features in applications |
US9083770B1 (en) | 2013-11-26 | 2015-07-14 | Snapchat, Inc. | Method and system for integrating real time communication features in applications |
US11546388B2 (en) | 2013-11-26 | 2023-01-03 | Snap Inc. | Method and system for integrating real time communication features in applications |
US11102253B2 (en) | 2013-11-26 | 2021-08-24 | Snap Inc. | Method and system for integrating real time communication features in applications |
US10681092B1 (en) | 2013-11-26 | 2020-06-09 | Snap Inc. | Method and system for integrating real time communication features in applications |
US10069876B1 (en) | 2013-11-26 | 2018-09-04 | Snap Inc. | Method and system for integrating real time communication features in applications |
US9936030B2 (en) | 2014-01-03 | 2018-04-03 | Investel Capital Corporation | User content sharing system and method with location-based external content integration |
US10080102B1 (en) | 2014-01-12 | 2018-09-18 | Investment Asset Holdings Llc | Location-based messaging |
US9866999B1 (en) | 2014-01-12 | 2018-01-09 | Investment Asset Holdings Llc | Location-based messaging |
US10349209B1 (en) | 2014-01-12 | 2019-07-09 | Investment Asset Holdings Llc | Location-based messaging |
US11902235B2 (en) | 2014-02-21 | 2024-02-13 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US11463393B2 (en) | 2014-02-21 | 2022-10-04 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10949049B1 (en) | 2014-02-21 | 2021-03-16 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10082926B1 (en) | 2014-02-21 | 2018-09-25 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US11463394B2 (en) | 2014-02-21 | 2022-10-04 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10084735B1 (en) | 2014-02-21 | 2018-09-25 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10958605B1 (en) | 2014-02-21 | 2021-03-23 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US9407712B1 (en) | 2014-03-07 | 2016-08-02 | Snapchat, Inc. | Content delivery network for ephemeral objects |
US9237202B1 (en) | 2014-03-07 | 2016-01-12 | Snapchat, Inc. | Content delivery network for ephemeral objects |
US11310183B2 (en) | 2014-05-09 | 2022-04-19 | Snap Inc. | Dynamic configuration of application component tiles |
US9276886B1 (en) | 2014-05-09 | 2016-03-01 | Snapchat, Inc. | Apparatus and method for dynamically configuring application component tiles |
US11743219B2 (en) | 2014-05-09 | 2023-08-29 | Snap Inc. | Dynamic configuration of application component tiles |
US10817156B1 (en) | 2014-05-09 | 2020-10-27 | Snap Inc. | Dynamic configuration of application component tiles |
US10990697B2 (en) | 2014-05-28 | 2021-04-27 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US9396354B1 (en) | 2014-05-28 | 2016-07-19 | Snapchat, Inc. | Apparatus and method for automated privacy protection in distributed images |
US9785796B1 (en) | 2014-05-28 | 2017-10-10 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US10572681B1 (en) | 2014-05-28 | 2020-02-25 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US11921805B2 (en) | 2014-06-05 | 2024-03-05 | Snap Inc. | Web document enhancement |
US11625443B2 (en) | 2014-06-05 | 2023-04-11 | Snap Inc. | Web document enhancement |
US10200813B1 (en) | 2014-06-13 | 2019-02-05 | Snap Inc. | Geo-location based event gallery |
US9094137B1 (en) | 2014-06-13 | 2015-07-28 | Snapchat, Inc. | Priority based placement of messages in a geo-location based event gallery |
US9825898B2 (en) | 2014-06-13 | 2017-11-21 | Snap Inc. | Prioritization of messages within a message collection |
US10623891B2 (en) | 2014-06-13 | 2020-04-14 | Snap Inc. | Prioritization of messages within a message collection |
US10779113B2 (en) | 2014-06-13 | 2020-09-15 | Snap Inc. | Prioritization of messages within a message collection |
US10659914B1 (en) | 2014-06-13 | 2020-05-19 | Snap Inc. | Geo-location based event gallery |
US11317240B2 (en) | 2014-06-13 | 2022-04-26 | Snap Inc. | Geo-location based event gallery |
US9693191B2 (en) | 2014-06-13 | 2017-06-27 | Snap Inc. | Prioritization of messages within gallery |
US9532171B2 (en) | 2014-06-13 | 2016-12-27 | Snap Inc. | Geo-location based event gallery |
US9430783B1 (en) | 2014-06-13 | 2016-08-30 | Snapchat, Inc. | Prioritization of messages within gallery |
US10524087B1 (en) | 2014-06-13 | 2019-12-31 | Snap Inc. | Message destination list mechanism |
US9113301B1 (en) | 2014-06-13 | 2015-08-18 | Snapchat, Inc. | Geo-location based event gallery |
US11166121B2 (en) | 2014-06-13 | 2021-11-02 | Snap Inc. | Prioritization of messages within a message collection |
US10182311B2 (en) | 2014-06-13 | 2019-01-15 | Snap Inc. | Prioritization of messages within a message collection |
US10448201B1 (en) | 2014-06-13 | 2019-10-15 | Snap Inc. | Prioritization of messages within a message collection |
US11595569B2 (en) | 2014-07-07 | 2023-02-28 | Snap Inc. | Supplying content aware photo filters |
US11122200B2 (en) | 2014-07-07 | 2021-09-14 | Snap Inc. | Supplying content aware photo filters |
US9225897B1 (en) | 2014-07-07 | 2015-12-29 | Snapchat, Inc. | Apparatus and method for supplying content aware photo filters |
US10432850B1 (en) | 2014-07-07 | 2019-10-01 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10154192B1 (en) | 2014-07-07 | 2018-12-11 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11849214B2 (en) | 2014-07-07 | 2023-12-19 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10602057B1 (en) | 2014-07-07 | 2020-03-24 | Snap Inc. | Supplying content aware photo filters |
US9407816B1 (en) | 2014-07-07 | 2016-08-02 | Snapchat, Inc. | Apparatus and method for supplying content aware photo filters |
US10701262B1 (en) | 2014-07-07 | 2020-06-30 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11496673B1 (en) | 2014-07-07 | 2022-11-08 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10348960B1 (en) | 2014-07-07 | 2019-07-09 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10055717B1 (en) | 2014-08-22 | 2018-08-21 | Snap Inc. | Message processor with application prompts |
US11017363B1 (en) | 2014-08-22 | 2021-05-25 | Snap Inc. | Message processor with application prompts |
US10423983B2 (en) | 2014-09-16 | 2019-09-24 | Snap Inc. | Determining targeting information based on a predictive targeting model |
US11625755B1 (en) | 2014-09-16 | 2023-04-11 | Foursquare Labs, Inc. | Determining targeting information based on a predictive targeting model |
US10824654B2 (en) | 2014-09-18 | 2020-11-03 | Snap Inc. | Geolocation-based pictographs |
US11741136B2 (en) | 2014-09-18 | 2023-08-29 | Snap Inc. | Geolocation-based pictographs |
US11281701B2 (en) | 2014-09-18 | 2022-03-22 | Snap Inc. | Geolocation-based pictographs |
US11216869B2 (en) | 2014-09-23 | 2022-01-04 | Snap Inc. | User interface to augment an image using geolocation |
US10476830B2 (en) | 2014-10-02 | 2019-11-12 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US10958608B1 (en) | 2014-10-02 | 2021-03-23 | Snap Inc. | Ephemeral gallery of visual media messages |
US10944710B1 (en) | 2014-10-02 | 2021-03-09 | Snap Inc. | Ephemeral gallery user interface with remaining gallery time indication |
US11038829B1 (en) | 2014-10-02 | 2021-06-15 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US9537811B2 (en) | 2014-10-02 | 2017-01-03 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US11012398B1 (en) | 2014-10-02 | 2021-05-18 | Snap Inc. | Ephemeral message gallery user interface with screenshot messages |
US11411908B1 (en) | 2014-10-02 | 2022-08-09 | Snap Inc. | Ephemeral message gallery user interface with online viewing history indicia |
US11855947B1 (en) | 2014-10-02 | 2023-12-26 | Snap Inc. | Gallery of ephemeral messages |
US10708210B1 (en) | 2014-10-02 | 2020-07-07 | Snap Inc. | Multi-user ephemeral message gallery |
US10284508B1 (en) | 2014-10-02 | 2019-05-07 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US20170374003A1 (en) | 2014-10-02 | 2017-12-28 | Snapchat, Inc. | Ephemeral gallery of ephemeral messages |
US11522822B1 (en) | 2014-10-02 | 2022-12-06 | Snap Inc. | Ephemeral gallery elimination based on gallery and message timers |
US11190679B2 (en) | 2014-11-12 | 2021-11-30 | Snap Inc. | Accessing media at a geographic location |
US9843720B1 (en) | 2014-11-12 | 2017-12-12 | Snap Inc. | User interface for accessing media at a geographic location |
US10616476B1 (en) | 2014-11-12 | 2020-04-07 | Snap Inc. | User interface for accessing media at a geographic location |
US10811053B2 (en) | 2014-12-19 | 2020-10-20 | Snap Inc. | Routing messages by message parameter |
US11803345B2 (en) | 2014-12-19 | 2023-10-31 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11783862B2 (en) | 2014-12-19 | 2023-10-10 | Snap Inc. | Routing messages by message parameter |
US9385983B1 (en) | 2014-12-19 | 2016-07-05 | Snapchat, Inc. | Gallery of messages from individuals with a shared interest |
US10514876B2 (en) | 2014-12-19 | 2019-12-24 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US10580458B2 (en) | 2014-12-19 | 2020-03-03 | Snap Inc. | Gallery of videos set to an audio time line |
US11250887B2 (en) | 2014-12-19 | 2022-02-15 | Snap Inc. | Routing messages by message parameter |
US10311916B2 (en) | 2014-12-19 | 2019-06-04 | Snap Inc. | Gallery of videos set to an audio time line |
US9854219B2 (en) | 2014-12-19 | 2017-12-26 | Snap Inc. | Gallery of videos set to an audio time line |
US11372608B2 (en) | 2014-12-19 | 2022-06-28 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11734342B2 (en) | 2015-01-09 | 2023-08-22 | Snap Inc. | Object recognition based image overlays |
US10380720B1 (en) | 2015-01-09 | 2019-08-13 | Snap Inc. | Location-based image filters |
US11301960B2 (en) | 2015-01-09 | 2022-04-12 | Snap Inc. | Object recognition based image filters |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US11388226B1 (en) | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
US10416845B1 (en) | 2015-01-19 | 2019-09-17 | Snap Inc. | Multichannel system |
US11249617B1 (en) | 2015-01-19 | 2022-02-15 | Snap Inc. | Multichannel system |
US10133705B1 (en) | 2015-01-19 | 2018-11-20 | Snap Inc. | Multichannel system |
US10536800B1 (en) | 2015-01-26 | 2020-01-14 | Snap Inc. | Content request by location |
US10123166B2 (en) | 2015-01-26 | 2018-11-06 | Snap Inc. | Content request by location |
US11910267B2 (en) | 2015-01-26 | 2024-02-20 | Snap Inc. | Content request by location |
US10932085B1 (en) | 2015-01-26 | 2021-02-23 | Snap Inc. | Content request by location |
US11528579B2 (en) | 2015-01-26 | 2022-12-13 | Snap Inc. | Content request by location |
US10223397B1 (en) | 2015-03-13 | 2019-03-05 | Snap Inc. | Social graph based co-location of network users |
US10616239B2 (en) | 2015-03-18 | 2020-04-07 | Snap Inc. | Geo-fence authorization provisioning |
US11902287B2 (en) | 2015-03-18 | 2024-02-13 | Snap Inc. | Geo-fence authorization provisioning |
US10893055B2 (en) | 2015-03-18 | 2021-01-12 | Snap Inc. | Geo-fence authorization provisioning |
US11662576B2 (en) | 2015-03-23 | 2023-05-30 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US10948717B1 (en) | 2015-03-23 | 2021-03-16 | Snap Inc. | Reducing boot time and power consumption in wearable display systems |
US11320651B2 (en) | 2015-03-23 | 2022-05-03 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US11496544B2 (en) | 2015-05-05 | 2022-11-08 | Snap Inc. | Story and sub-story navigation |
US10911575B1 (en) | 2015-05-05 | 2021-02-02 | Snap Inc. | Systems and methods for story and sub-story navigation |
US10135949B1 (en) | 2015-05-05 | 2018-11-20 | Snap Inc. | Systems and methods for story and sub-story navigation |
US10592574B2 (en) | 2015-05-05 | 2020-03-17 | Snap Inc. | Systems and methods for automated local story generation and curation |
US11449539B2 (en) | 2015-05-05 | 2022-09-20 | Snap Inc. | Automated local story generation and curation |
US11392633B2 (en) | 2015-05-05 | 2022-07-19 | Snap Inc. | Systems and methods for automated local story generation and curation |
US10993069B2 (en) | 2015-07-16 | 2021-04-27 | Snap Inc. | Dynamically adaptive media content delivery |
US10817898B2 (en) | 2015-08-13 | 2020-10-27 | Placed, Llc | Determining exposures to content presented by physical objects |
US10733802B2 (en) | 2015-10-30 | 2020-08-04 | Snap Inc. | Image based tracking in augmented reality systems |
US11769307B2 (en) | 2015-10-30 | 2023-09-26 | Snap Inc. | Image based tracking in augmented reality systems |
US11315331B2 (en) | 2015-10-30 | 2022-04-26 | Snap Inc. | Image based tracking in augmented reality systems |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10474321B2 (en) | 2015-11-30 | 2019-11-12 | Snap Inc. | Network resource location linking and visual content sharing |
US11599241B2 (en) | 2015-11-30 | 2023-03-07 | Snap Inc. | Network resource location linking and visual content sharing |
US11380051B2 (en) | 2015-11-30 | 2022-07-05 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11830117B2 (en) | 2015-12-18 | 2023-11-28 | Snap Inc. | Media overlay publication system |
US10354425B2 (en) | 2015-12-18 | 2019-07-16 | Snap Inc. | Method and system for providing context relevant media augmentation |
US10997758B1 (en) | 2015-12-18 | 2021-05-04 | Snap Inc. | Media overlay publication system |
US11468615B2 (en) | 2015-12-18 | 2022-10-11 | Snap Inc. | Media overlay publication system |
US10679389B2 (en) | 2016-02-26 | 2020-06-09 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11023514B2 (en) | 2016-02-26 | 2021-06-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US10834525B2 (en) | 2016-02-26 | 2020-11-10 | Snap Inc. | Generation, curation, and presentation of media collections |
US11889381B2 (en) | 2016-02-26 | 2024-01-30 | Snap Inc. | Generation, curation, and presentation of media collections |
US11197123B2 (en) | 2016-02-26 | 2021-12-07 | Snap Inc. | Generation, curation, and presentation of media collections |
US11611846B2 (en) | 2016-02-26 | 2023-03-21 | Snap Inc. | Generation, curation, and presentation of media collections |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11900418B2 (en) | 2016-04-04 | 2024-02-13 | Snap Inc. | Mutable geo-fencing system |
US11445326B2 (en) | 2016-06-28 | 2022-09-13 | Snap Inc. | Track engagement of media items |
US10430838B1 (en) | 2016-06-28 | 2019-10-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections with automated advertising |
US10165402B1 (en) | 2016-06-28 | 2018-12-25 | Snap Inc. | System to track engagement of media items |
US11640625B2 (en) | 2016-06-28 | 2023-05-02 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10885559B1 (en) | 2016-06-28 | 2021-01-05 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10785597B2 (en) | 2016-06-28 | 2020-09-22 | Snap Inc. | System to track engagement of media items |
US10219110B2 (en) | 2016-06-28 | 2019-02-26 | Snap Inc. | System to track engagement of media items |
US10327100B1 (en) | 2016-06-28 | 2019-06-18 | Snap Inc. | System to track engagement of media items |
US10506371B2 (en) | 2016-06-28 | 2019-12-10 | Snap Inc. | System to track engagement of media items |
US10735892B2 (en) | 2016-06-28 | 2020-08-04 | Snap Inc. | System to track engagement of media items |
US11895068B2 (en) | 2016-06-30 | 2024-02-06 | Snap Inc. | Automated content curation and communication |
US10387514B1 (en) | 2016-06-30 | 2019-08-20 | Snap Inc. | Automated content curation and communication |
US11080351B1 (en) | 2016-06-30 | 2021-08-03 | Snap Inc. | Automated content curation and communication |
US11509615B2 (en) | 2016-07-19 | 2022-11-22 | Snap Inc. | Generating customized electronic messaging graphics |
US10348662B2 (en) | 2016-07-19 | 2019-07-09 | Snap Inc. | Generating customized electronic messaging graphics |
US11816853B2 (en) | 2016-08-30 | 2023-11-14 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US11876762B1 (en) | 2016-10-24 | 2024-01-16 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US10623666B2 (en) | 2016-11-07 | 2020-04-14 | Snap Inc. | Selective identification and order of image modifiers |
US11233952B2 (en) | 2016-11-07 | 2022-01-25 | Snap Inc. | Selective identification and order of image modifiers |
US11750767B2 (en) | 2016-11-07 | 2023-09-05 | Snap Inc. | Selective identification and order of image modifiers |
US11397517B2 (en) | 2016-12-09 | 2022-07-26 | Snap Inc. | Customized media overlays |
US10203855B2 (en) | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
US10754525B1 (en) | 2016-12-09 | 2020-08-25 | Snap Inc. | Customized media overlays |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US10915911B2 (en) | 2017-02-03 | 2021-02-09 | Snap Inc. | System to determine a price-schedule to distribute media content |
US11720640B2 (en) | 2017-02-17 | 2023-08-08 | Snap Inc. | Searching social media content |
US11250075B1 (en) | 2017-02-17 | 2022-02-15 | Snap Inc. | Searching social media content |
US10319149B1 (en) | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US11861795B1 (en) | 2017-02-17 | 2024-01-02 | Snap Inc. | Augmented reality anamorphosis system |
US11748579B2 (en) | 2017-02-20 | 2023-09-05 | Snap Inc. | Augmented reality speech balloon system |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US11670057B2 (en) | 2017-03-06 | 2023-06-06 | Snap Inc. | Virtual vision system |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
US10887269B1 (en) | 2017-03-09 | 2021-01-05 | Snap Inc. | Restricted group content collection |
US11258749B2 (en) | 2017-03-09 | 2022-02-22 | Snap Inc. | Restricted group content collection |
US10523625B1 (en) | 2017-03-09 | 2019-12-31 | Snap Inc. | Restricted group content collection |
US10581782B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US11297399B1 (en) | 2017-03-27 | 2022-04-05 | Snap Inc. | Generating a stitched data stream |
US10582277B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US11558678B2 (en) | 2017-03-27 | 2023-01-17 | Snap Inc. | Generating a stitched data stream |
US11349796B2 (en) | 2017-03-27 | 2022-05-31 | Snap Inc. | Generating a stitched data stream |
US11170393B1 (en) | 2017-04-11 | 2021-11-09 | Snap Inc. | System to calculate an engagement score of location based media content |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US11195018B1 (en) | 2017-04-20 | 2021-12-07 | Snap Inc. | Augmented reality typography personalization system |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US11782574B2 (en) | 2017-04-27 | 2023-10-10 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11451956B1 (en) | 2017-04-27 | 2022-09-20 | Snap Inc. | Location privacy management on map-based social media platforms |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11474663B2 (en) | 2017-04-27 | 2022-10-18 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11385763B2 (en) | 2017-04-27 | 2022-07-12 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11392264B1 (en) | 2017-04-27 | 2022-07-19 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11409407B2 (en) | 2017-04-27 | 2022-08-09 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11556221B2 (en) | 2017-04-27 | 2023-01-17 | Snap Inc. | Friend location sharing mechanism for social media platforms |
US11418906B2 (en) | 2017-04-27 | 2022-08-16 | Snap Inc. | Selective location-based identity communication |
US11232040B1 (en) | 2017-04-28 | 2022-01-25 | Snap Inc. | Precaching unlockable data elements |
US11675831B2 (en) | 2017-05-31 | 2023-06-13 | Snap Inc. | Geolocation based playlists |
US11475254B1 (en) | 2017-09-08 | 2022-10-18 | Snap Inc. | Multimodal entity identification |
US11721080B2 (en) | 2017-09-15 | 2023-08-08 | Snap Inc. | Augmented reality system |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US11335067B2 (en) | 2017-09-15 | 2022-05-17 | Snap Inc. | Augmented reality system |
US10499191B1 (en) | 2017-10-09 | 2019-12-03 | Snap Inc. | Context sensitive presentation of content |
US11617056B2 (en) | 2017-10-09 | 2023-03-28 | Snap Inc. | Context sensitive presentation of content |
US11006242B1 (en) | 2017-10-09 | 2021-05-11 | Snap Inc. | Context sensitive presentation of content |
US11670025B2 (en) | 2017-10-30 | 2023-06-06 | Snap Inc. | Mobile-based cartographic control of display content |
US11030787B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Mobile-based cartographic control of display content |
US11265273B1 (en) | 2017-12-01 | 2022-03-01 | Snap Inc. | Dynamic media overlay with smart widget |
US11558327B2 (en) | 2017-12-01 | 2023-01-17 | Snap Inc. | Dynamic media overlay with smart widget |
US11943185B2 (en) | 2017-12-01 | 2024-03-26 | Snap Inc. | Dynamic media overlay with smart widget |
US11017173B1 (en) | 2017-12-22 | 2021-05-25 | Snap Inc. | Named entity recognition visual context and caption data |
US11687720B2 (en) | 2017-12-22 | 2023-06-27 | Snap Inc. | Named entity recognition visual context and caption data |
US11487794B2 (en) | 2018-01-03 | 2022-11-01 | Snap Inc. | Tag distribution visualization system |
US10678818B2 (en) | 2018-01-03 | 2020-06-09 | Snap Inc. | Tag distribution visualization system |
US11841896B2 (en) | 2018-02-13 | 2023-12-12 | Snap Inc. | Icon based tagging |
US11507614B1 (en) | 2018-02-13 | 2022-11-22 | Snap Inc. | Icon based tagging |
US10885136B1 (en) | 2018-02-28 | 2021-01-05 | Snap Inc. | Audience filtering system |
US11523159B2 (en) | 2018-02-28 | 2022-12-06 | Snap Inc. | Generating media content items based on location information |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US10327096B1 (en) | 2018-03-06 | 2019-06-18 | Snap Inc. | Geo-fence selection system |
US11570572B2 (en) | 2018-03-06 | 2023-01-31 | Snap Inc. | Geo-fence selection system |
US11044574B2 (en) | 2018-03-06 | 2021-06-22 | Snap Inc. | Geo-fence selection system |
US10524088B2 (en) | 2018-03-06 | 2019-12-31 | Snap Inc. | Geo-fence selection system |
US11722837B2 (en) | 2018-03-06 | 2023-08-08 | Snap Inc. | Geo-fence selection system |
US11491393B2 (en) | 2018-03-14 | 2022-11-08 | Snap Inc. | Generating collectible items based on location information |
US10933311B2 (en) | 2018-03-14 | 2021-03-02 | Snap Inc. | Generating collectible items based on location information |
US11163941B1 (en) | 2018-03-30 | 2021-11-02 | Snap Inc. | Annotating a collection of media content items |
US10779114B2 (en) | 2018-04-18 | 2020-09-15 | Snap Inc. | Visitation tracking system |
US10219111B1 (en) | 2018-04-18 | 2019-02-26 | Snap Inc. | Visitation tracking system |
US10448199B1 (en) | 2018-04-18 | 2019-10-15 | Snap Inc. | Visitation tracking system |
US10681491B1 (en) | 2018-04-18 | 2020-06-09 | Snap Inc. | Visitation tracking system |
US10924886B2 (en) | 2018-04-18 | 2021-02-16 | Snap Inc. | Visitation tracking system |
US11683657B2 (en) | 2018-04-18 | 2023-06-20 | Snap Inc. | Visitation tracking system |
US11297463B2 (en) | 2018-04-18 | 2022-04-05 | Snap Inc. | Visitation tracking system |
US11860888B2 (en) | 2018-05-22 | 2024-01-02 | Snap Inc. | Event detection system |
US10789749B2 (en) | 2018-07-24 | 2020-09-29 | Snap Inc. | Conditional modification of augmented reality object |
US10679393B2 (en) | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
US10943381B2 (en) | 2018-07-24 | 2021-03-09 | Snap Inc. | Conditional modification of augmented reality object |
US11367234B2 (en) | 2018-07-24 | 2022-06-21 | Snap Inc. | Conditional modification of augmented reality object |
US11670026B2 (en) | 2018-07-24 | 2023-06-06 | Snap Inc. | Conditional modification of augmented reality object |
US11676319B2 (en) | 2018-08-31 | 2023-06-13 | Snap Inc. | Augmented reality anthropomorphization system |
US11450050B2 (en) | 2018-08-31 | 2022-09-20 | Snap Inc. | Augmented reality anthropomorphization system |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US11704005B2 (en) | 2018-09-28 | 2023-07-18 | Snap Inc. | Collaborative achievement interface |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11799811B2 (en) | 2018-10-31 | 2023-10-24 | Snap Inc. | Messaging and gaming applications communication platform |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US11698722B2 (en) | 2018-11-30 | 2023-07-11 | Snap Inc. | Generating customized avatars based on location information |
US11558709B2 (en) | 2018-11-30 | 2023-01-17 | Snap Inc. | Position service to determine relative position to map features |
US11812335B2 (en) | 2018-11-30 | 2023-11-07 | Snap Inc. | Position service to determine relative position to map features |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11693887B2 (en) | 2019-01-30 | 2023-07-04 | Snap Inc. | Adaptive spatial density based clustering |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11500525B2 (en) | 2019-02-25 | 2022-11-15 | Snap Inc. | Custom media overlay system |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11740760B2 (en) | 2019-03-28 | 2023-08-29 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11249614B2 (en) | 2019-03-28 | 2022-02-15 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11361493B2 (en) | 2019-04-01 | 2022-06-14 | Snap Inc. | Semantic texture mapping system |
US11606755B2 (en) | 2019-05-30 | 2023-03-14 | Snap Inc. | Wearable device location systems architecture |
US11785549B2 (en) | 2019-05-30 | 2023-10-10 | Snap Inc. | Wearable device location systems |
US11206615B2 (en) | 2019-05-30 | 2021-12-21 | Snap Inc. | Wearable device location systems |
US11917495B2 (en) | 2019-06-07 | 2024-02-27 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11821742B2 (en) | 2019-09-26 | 2023-11-21 | Snap Inc. | Travel based notifications |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11729343B2 (en) | 2019-12-30 | 2023-08-15 | Snap Inc. | Including video feed in message thread |
US11429618B2 (en) | 2019-12-30 | 2022-08-30 | Snap Inc. | Surfacing augmented reality objects |
US11943303B2 (en) | 2019-12-31 | 2024-03-26 | Snap Inc. | Augmented reality objects registry |
US11343323B2 (en) | 2019-12-31 | 2022-05-24 | Snap Inc. | Augmented reality objects registry |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11228551B1 (en) | 2020-02-12 | 2022-01-18 | Snap Inc. | Multiple gateway message exchange |
US11888803B2 (en) | 2020-02-12 | 2024-01-30 | Snap Inc. | Multiple gateway message exchange |
US11516167B2 (en) | 2020-03-05 | 2022-11-29 | Snap Inc. | Storing data based on device location |
US11765117B2 (en) | 2020-03-05 | 2023-09-19 | Snap Inc. | Storing data based on device location |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11430091B2 (en) | 2020-03-27 | 2022-08-30 | Snap Inc. | Location mapping for large scale augmented-reality |
US11915400B2 (en) | 2020-03-27 | 2024-02-27 | Snap Inc. | Location mapping for large scale augmented-reality |
US11776256B2 (en) | 2020-03-27 | 2023-10-03 | Snap Inc. | Shared augmented reality system |
US11503432B2 (en) | 2020-06-15 | 2022-11-15 | Snap Inc. | Scalable real-time location sharing framework |
US11290851B2 (en) | 2020-06-15 | 2022-03-29 | Snap Inc. | Location sharing using offline and online objects |
US11314776B2 (en) | 2020-06-15 | 2022-04-26 | Snap Inc. | Location sharing using friend list versions |
US11483267B2 (en) | 2020-06-15 | 2022-10-25 | Snap Inc. | Location sharing using different rate-limited links |
US11676378B2 (en) | 2020-06-29 | 2023-06-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US11943192B2 (en) | 2020-08-31 | 2024-03-26 | Snap Inc. | Co-location connection service |
US11601888B2 (en) | 2021-03-29 | 2023-03-07 | Snap Inc. | Determining location using multi-source geolocation data |
US11606756B2 (en) | 2021-03-29 | 2023-03-14 | Snap Inc. | Scheduling requests for location data |
US11902902B2 (en) | 2021-03-29 | 2024-02-13 | Snap Inc. | Scheduling requests for location data |
US11645324B2 (en) | 2021-03-31 | 2023-05-09 | Snap Inc. | Location-based timeline media content system |
US20220365632A1 (en) * | 2021-05-17 | 2022-11-17 | Apple Inc. | Interacting with notes user interfaces |
US11829834B2 (en) | 2021-10-29 | 2023-11-28 | Snap Inc. | Extended QR code |
US11956533B2 (en) | 2021-11-29 | 2024-04-09 | Snap Inc. | Accessing media at a geographic location |
US11954314B2 (en) | 2022-09-09 | 2024-04-09 | Snap Inc. | Custom media overlay system |
Also Published As
Publication number | Publication date |
---|---|
JP5184832B2 (en) | 2013-04-17 |
US9389746B2 (en) | 2016-07-12 |
JP2009025920A (en) | 2009-02-05 |
US20120102430A1 (en) | 2012-04-26 |
US20090024956A1 (en) | 2009-01-22 |
Similar Documents
Publication | Title |
---|---|
US8112716B2 (en) | Information processing apparatus and control method thereof, and computer program | |
JP4086050B2 (en) | Information management program and information management apparatus | |
WO2012043255A1 (en) | Image editing method and device, and image editing program | |
US20100162163A1 (en) | Image magnification | |
JP4397347B2 (en) | Input device | |
EP2606419B1 (en) | Touch-sensitive electronic device | |
US9082050B2 (en) | Computer-readable medium storing image processing program and image processing apparatus for improving the operability of image arrangement in a print preview screen | |
JP5400578B2 (en) | Display control apparatus and control method thereof | |
US6993709B1 (en) | Smart corner move snapping | |
US8928919B2 (en) | Computer-readable medium storing image processing program and image processing apparatus | |
JP6248462B2 (en) | Information processing apparatus and program | |
JP2012084063A (en) | Display control apparatus, display control method, and program | |
US9785333B2 (en) | Display device, image processing apparatus, non-transitory computer readable medium, and display control method | |
JP5457765B2 (en) | Information processing apparatus and control method thereof | |
US9632697B2 (en) | Information processing apparatus and control method thereof, and non-transitory computer-readable medium | |
US9292185B2 (en) | Display device and display method | |
US20170038953A1 (en) | Display apparatus and display method for displaying main data and data related to that main data, and a memory medium | |
US10963137B2 (en) | Information display apparatus and non-transitory recording medium storing program for controlling information display apparatus | |
JP7130514B2 (en) | Information processing device and its control method and program | |
US20170351423A1 (en) | Information processing apparatus, information processing method and computer-readable storage medium storing program | |
JP7447494B2 (en) | Display device and display control program | |
JP6365268B2 (en) | Display device, image forming apparatus, display method, and display program | |
JP2004094385A (en) | Area selecting system and method for image inputting device, and its program | |
US11947787B2 (en) | Image processing apparatus, image processing method, and storage medium | |
US11900044B2 (en) | Display method and display apparatus for displaying page image and thumbnail images of page image |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: CANON KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KOBAYASHI, KIWAMU;REEL/FRAME:021296/0775. Effective date: 20080708 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
FPAY | Fee payment | Year of fee payment: 4 |
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee | Effective date: 20200207 |