US20130222323A1 - Peekable User Interface On a Portable Electronic Device - Google Patents
- Publication number
- US20130222323A1 (application US13/404,308)
- Authority
- US
- United States
- Prior art keywords
- touch
- display
- electronic device
- portable electronic
- user interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3215—Monitoring of peripheral devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/325—Power saving in peripheral device
- G06F1/3265—Power saving in display device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/04—Supports for telephone transmitters or receivers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1633—Protecting arrangement for the entire housing of the computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/50—Reducing energy consumption in communication networks in wire-line communication networks, e.g. low power modes or reduced link rate
Definitions
- the present disclosure relates generally to displaying user interface information on a portable electronic device.
- Touch-sensitive displays have become more prevalent in portable electronic devices, providing an information display and interaction interface. Touch-sensitive displays enable users to interact with the device using numerous interaction points rather than a fixed binary button configuration. However, the portable electronic device can be deactivated in a sleep mode or locked by the user, when placed in a protective case, or when the display is covered, requiring the user to remove or uncover the display and perform an unlock or wake gesture in order to access content.
- FIG. 1 shows a representation of portable electronic device and a sleeve type covering apparatus
- FIG. 2 shows a representation of the peekable user interface when the portable electronic device is in the covering apparatus using a single finger gesture
- FIG. 3 shows another representation of the peekable user interface when the portable electronic device is in the covering apparatus using a single finger gesture
- FIG. 4 shows a representation of the peekable user interface when the portable electronic device is in a covering apparatus using a two finger gesture
- FIG. 5 shows a representation of the peekable user interface when the device is not in the covering apparatus using a single finger gesture
- FIG. 6 shows a representation of a portable electronic device and a pocket pouch type covering apparatus
- FIG. 7 shows a representation of the portable electronic device in the pocket pouch covering apparatus
- FIG. 8 shows a representation of the portable electronic device in the pocket pouch covering apparatus in a first display position
- FIG. 9 shows a representation of the portable electronic device in the pocket pouch covering apparatus in a second display position
- FIG. 10 shows a method of displaying a user interface when inserted in a covering apparatus
- FIG. 11 shows another method of displaying a user interface when inserted in a covering apparatus
- FIG. 12 shows a method of displaying a user interface based upon an application context
- FIG. 13 shows a block diagram of a portable electronic device in accordance with an example embodiment
- FIG. 14 shows a front view of an example of a portable electronic device
- FIG. 15 shows examples of touches on the portable electronic device of FIG. 14 .
- a method of displaying a user interface on a portable electronic device comprising: detecting an input corresponding to displacement of a covering apparatus, the displacement uncovering a portion of a display of the portable electronic device, the display being in a low power condition; and illuminating at least the uncovered portion of the display and displaying the user interface, the user interface presenting information that is determined at least in part by the extent of the displacement.
- a portable electronic device comprising a touch-sensitive display; a processor coupled to the touch-sensitive display; a memory coupled to the processor containing instructions which when executed by the processor perform: detecting an input corresponding to displacement of a covering apparatus, the displacement uncovering a portion of the touch-sensitive display of the portable electronic device, the touch-sensitive display being in a low power condition; and illuminating at least the uncovered portion of the touch-sensitive display and displaying the user interface, the user interface presenting information that is determined at least in part by the extent of the displacement.
- a computer readable memory containing instructions for presenting a user interface on a portable electronic device, the instructions which when executed by a processor perform the method comprising: detecting an input corresponding to displacement of a covering apparatus, the displacement uncovering a portion of a display of the portable electronic device, the display being in a low power condition; and illuminating at least the uncovered portion of the display and displaying the user interface, the user interface presenting information that is determined at least in part by the extent of the displacement.
- When a user inserts a portable electronic device in a covering apparatus such as a sleeve or a pouch, or covers the display of the device with a cover, the device is typically placed into a low power condition and locked for security or to prevent inadvertent interaction with the device.
- For tasks such as checking the time, checking if new e-mail has arrived, social network status changes, or media playback status, the user only needs to momentarily access information on the device, and performing an unlock and then navigating to a particular application to retrieve the information can be cumbersome and inconvenient.
- the present disclosure provides a user interface (UI) to present information commonly accessed by the user for brief periods, that is contextual to the display of the device being partially uncovered and is activated by an input from the covering apparatus, a touch contact, interface gestures, or meta-navigation gestures.
- the portable electronic device detects that it is in a covering apparatus, for example using a Hall effect sensor that activates holster events when a magnetic case is present. When in the covering apparatus, the device detects input to activate a special UI in “case mode” and illuminate the display.
- the UI displays a simplified view of the type of information a user would want to access quickly without having to remove the device from the case (time, last message received, etc.).
- a special UI is shown giving a limited and more targeted display of information based upon a touch contact, interface gesture, meta-navigation gesture, or movement of the device or cover to display a portion of the display. Different information may be presented depending on an application being executed on the device, such as a media player. In addition, different types of display contact would display different types of information depending on the amount of display area that is visible from the case.
- “peeking” at the display with one finger could show the time, while two fingers could display further information such as a message list, or a media player song list if audio is being played back on the device, with the assumption that more display area would be visible.
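The finger-count mapping described above can be sketched as a simple lookup. This is an illustrative assumption of how such a mapping might be implemented; the function name, the information-type labels, and the media-playback rule are not taken from the patent text.

```python
# Hypothetical sketch: map the number of simultaneous touch contacts to the
# type of peek information to display. One finger shows minimal information
# (the time); two or more fingers assume a larger uncovered display area and
# show a message list, or a song list if media is being played back.

def peek_info_type(num_contacts: int, media_playing: bool = False) -> str:
    """Return the information type for a peek gesture with num_contacts fingers."""
    if num_contacts <= 0:
        return "none"  # no contact: the display stays in the low power condition
    if num_contacts == 1:
        return "time"  # single finger: only a small display area is assumed visible
    # Two or more fingers: assume more display area is uncovered.
    return "media_playlist" if media_playing else "message_list"
```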
- the use of different covering apparatus designs, such as sleeve cases, pouch cases, removable screen covers, or folio designs, could present the information in a different manner.
- a foldable or articulating case or display cover may display a different amount of information based upon the portion of the cover that has been folded back, and the orientation of the case (portrait vs. landscape) would define how and where the display area would be visible to the user when ‘peeking’ at the display.
- FIG. 1 shows a representation of portable electronic device 100 and a covering apparatus 110 .
- the portable electronic device 100 is a tablet type form factor and the covering apparatus is a sleeve type case made of a flexible material in which the portable electronic device 100 is inserted through a lengthwise opening 112 .
- the touch-sensitive display 102 of the portable electronic device 100 is covered by the covering apparatus 110 material when placed inside.
- FIG. 2 shows a representation of the peekable user interface when the portable electronic device is in the covering apparatus using a single finger gesture.
- When the portable electronic device 100 is placed in the covering apparatus sleeve case 110 , the portable electronic device 100 is in a sleep, low power condition or locked state until some form of user interaction occurs, in which case an unlock screen would be presented.
- the state of the portable electronic device 100 may be based upon an action performed by the user prior to placing the portable electronic device 100 in the case such as locking the device or placing the device in a sleep mode, or by the device detecting the insertion into a case 110 by one or more sensors on the portable electronic device 100 .
- the portable electronic device 100 senses an input such as a touch contact which may be part of a gesture on the touch-sensitive display 102 , while within the case, and illuminates the display to display content or information 202 based upon the input in the uncovered portion of the touch-sensitive display 102 .
- a single finger 210 touch contact would present information of a first information type, such as the time 202 , which would easily fit within the displayable screen area.
- the portable electronic device 100 can display content based upon the amount of display real estate that would be visible, based upon the type of case 110 and the position of the touch contact or gesture received.
- the covering apparatus is made of a material with sufficient insulating properties that does not allow user touch contact to be conveyed through the case itself to ensure that false touch contacts are not generated.
- FIG. 3 is similar to FIG. 2 in that a single finger gesture is performed on the touch-sensitive display 102 of the device; however, the displayed content is resized to account for the larger portion of display real estate 302 that is shown, while the same information is shown. The information itself may be resized or scaled based upon the determined display room available. The device may only render or activate the display area that would be visible based upon the case type.
- FIG. 4 shows a representation of the peekable user interface when the portable electronic device is in a covering apparatus using a two finger touch contact.
- a two finger touch contact 410 412 is performed by the user to pull down the covering apparatus from the display.
- more or different information of a second information type is displayed in the UI 402 , such as a list of e-mail messages received on the device 100 .
- the user may not be able to interact with the content as the device may be locked and can only show certain content items.
- the content that is displayed based upon the touch contact may be application dependent, such as showing a media playlist if the device is playing media content.
- gestures or meta-navigation gestures associated with the touch contact may be utilized to determine the information to be shown in the UI. For example, if the left finger 412 is lower on the display than the right finger 410 , the device may show social networking information, whereas if the fingers 410 412 are at approximately the same level, e-mail information may be displayed; or, if the touch contact is associated with a direction of a swipe on the display, different information may be presented based upon the direction.
- FIG. 5 shows a representation of the peekable user interface when the device is not in the covering apparatus using a single finger touch contact as an example of the rendering of the display that would occur when within the case.
- a single finger touch contact point defines a display region based upon the type of case, in this example a sleeve type case. Only a portion of the display 102 is illuminated or activated 502 based upon a single finger 510 touch contact input as would be defined by the movement of the material of the case. Other portions of the display 102 such as on the left 520 and right 522 sides of the active display portion may remain inactive or not display any content to conserve power resources.
- a two finger touch contact input may occur to define a quadrilateral display area based upon the touch contact positions relative to an orientation of one of the display edges.
- the display area may be defined based upon the touch contact position relative to a top edge of the display or based on a gesture input defining the starting positions of the gesture and the end positions of the gesture creating the quadrilateral display area.
- the contact points on the non-display area would be used to identify the start position, or top, of the quadrilateral display area.
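The quadrilateral display area defined by two touch contacts relative to the top edge of the display, as described above, could be computed as in the following sketch. The coordinate convention (y measured downward from the top edge) and the function names are assumptions for illustration.

```python
# Illustrative sketch: derive the uncovered display region from two touch
# contacts, assuming the cover has been pulled down from the top edge so the
# visible region is the quadrilateral between the top edge and the line
# through the two contact points.

def visible_quad(p_left, p_right, display_width):
    """Return the four corners (clockwise from top-left) of the uncovered
    quadrilateral defined by two (x, y) contact points, with y measured
    down from the top edge of the display."""
    (xl, yl), (xr, yr) = sorted([p_left, p_right])  # order the points by x
    return [(0, 0), (display_width, 0), (display_width, yr), (0, yl)]

def quad_area(corners):
    """Shoelace formula: area of the polygon to illuminate."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(corners, corners[1:] + corners[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```

With level contacts the region is a rectangle; with uneven contacts it is a trapezoid, matching the idea that the contact points mark the top of the covered material.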
- FIG. 6 shows a representation of a portable electronic device 600 and a pocket pouch type covering apparatus 610 .
- the portable electronic device 600 is of a small form factor such as a smart phone or mobile device having a touch-sensitive display 602 .
- a pocket pouch type case 610 is shown having a capacitive element 612 embedded within the case which will contact the touch-sensitive display 602 of the portable electronic device 600 , as shown in FIG. 7 , to provide the input to determine the displacement of the display 602 .
- the capacitive element may not contact a touch-sensitive portion of the display but be positioned above the display 602 on an inactive portion of the device 600 .
- the pocket pouch type covering apparatus may also be provided with a strap to cover over the top of the device 600 .
- FIG. 8 shows a representation of the portable electronic device 600 in the pocket pouch covering apparatus 610 in a first display position 802 .
- the device 600 is displaced partially out of the covering apparatus 610 .
- the touch-sensitive display 602 determines the relative position of the capacitive element 612 along the length of the touch-sensitive display 602 to a first position 802 , and the portion of visible display area.
- Information to be displayed in the user interface is determined and displayed for a first information type such as time information 804 and the display is illuminated with the information.
- alternate or additional information of a second information type can be displayed 904 such as e-mail items as shown in FIG. 9 , or other changeable status information or content such as social network status updates, text message, phone message, weather, media player status, or other updatable content.
- a finger touch contact or gesture input may also be utilized to determine the information for display when inserted in the case.
- the distance of the finger contact input along the length of the display rather than a number of fingers sensed may be used due to the smaller display 602 size.
- the information displayed may be shown based upon the distance by which the keyboard is exposed, and not require an unlock to be performed to view information. For example, if the keyboard is partially visible, providing the input that defines a display area, the time may be shown on the display while it is exposed, and if more of the keyboard is visible, other content such as recent e-mails may be shown.
- the relative position of the display over the keyboard defines the input that may result in different content being displayed on the display of the device until it is in a fully visible position or an unlock is performed.
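As a rough sketch of the pouch-case behavior above, the fraction of the display exposed (as sensed via the capacitive element's position) could select the information type. The thresholds, labels, and function name below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: select the peek information type from how far the
# device has been slid out of the pouch, expressed as a fraction of the
# display length (0.0 = fully inserted, 1.0 = fully withdrawn).

def pouch_peek_content(exposed_fraction: float) -> str:
    if exposed_fraction < 0.1:
        return "none"           # display effectively still covered: stay in low power
    if exposed_fraction < 0.4:
        return "time"           # first display position: time information only
    if exposed_fraction < 1.0:
        return "messages"       # second display position: e-mail items etc.
    return "unlock_screen"      # fully withdrawn: fall back to the normal unlock flow
```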
- FIG. 10 shows a method 1000 of providing a peekable user interface on a portable electronic device.
- the portable electronic device is inserted into or covered by the covering apparatus and enters a low power condition ( 1002 ).
- the device is placed in a sleep mode by the user or by the portable electronic device detecting the insertion by a sensor, such as a magnetic sensor, or by changes in conditions around the portable electronic device such as light.
- the device detects an input by displacement of the covering apparatus.
- the input may be detected from movement of the covering apparatus or by a touch input or contact on the display or by an input sensor of the portable electronic device, while a portion of the display is visible and a remaining portion is covered by the covering apparatus ( 1004 ).
- To enable contact with the display, the device or the case must be moved to uncover a portion of the display so that it is visible.
- For a covering apparatus such as a sleeve case, a portion of the case may be slid downward by the user, the device may be pulled upward to display a portion of the display, or a foldable segmented cover may be partially folded away from the display.
- the visible or displaced portion of the display is then illuminated, exiting the low power condition, and the user interface is then displayed on the touch-sensitive display while the input is detected; the user interface information is determined based on a position of the input on the display ( 1006 ), enabling the user to readily access peekable information without unlocking, completely uncovering, or removing the device from the covering apparatus. If the input is removed by re-covering with the protective apparatus, the device re-enters a low power condition.
- multiple capacitive elements may be embedded in the cover, where the device detects the number of touch contacts; for example, two out of three cover segments are covering the display and therefore only a third of the display is visible.
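A minimal sketch of the segmented-cover idea above: with one capacitive element per cover segment, the number of elements still in contact gives the covered fraction of the display, and its complement the visible fraction. The function name is an assumption for illustration.

```python
# Hypothetical sketch: estimate the visible display fraction from how many
# cover-segment capacitive elements are still contacting the touch display.

def visible_fraction(total_segments: int, segments_in_contact: int) -> float:
    """Fraction of the display visible when segments_in_contact of the
    total_segments cover segments are still touching the display."""
    covered = segments_in_contact / total_segments
    return 1.0 - covered
```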
- FIG. 11 shows a method 1100 of providing a peekable user interface as an expansion of method 1000 .
- the portable electronic device detects that it is covered by a covering apparatus and enters a low power condition ( 1102 ).
- the device detects an input such as touch contact on a portion of the touch-sensitive display ( 1104 ) which may be by a user finger, a capacitive element provided in the case, or by an input sensor.
- the device may have to be partially removed from the case or the cover moved away from the display making a portion of the display visible.
- the touch contact may be a single or multiple inputs depending on the case, or device configuration, such as a single, double, or triple finger contact.
- the position of the input or contact(s) within the display is determined ( 1106 ).
- the information for display is determined based upon the input ( 1108 ). For example, a single contact may be for displaying the time, a double contact for displaying recent e-mail, or a triple contact for displaying social network status updates.
- the information may also be based upon the amount of display area determined to be visible by the touch contact, based upon the case type of the covering apparatus. For example, in a sleeve type case a single touch contact may only expose a triangular display portion, while in a pouch type covering apparatus a single contact may show a significant portion of the display as the device is slid out of the pouch.
- the determined information is then displayed by illuminating at least the portion of the display that is visible ( 1110 ).
- the information may be formatted to be displayed within the viewable display area. If the user maintains the contact with the display (YES at 1112 ), the information continues to be displayed; when contact changes and the input is not maintained (NO at 1112 ), the device re-enters the low power condition ( 1114 ). However, if a new position or a different touch contact is detected, the change is detected ( 1104 ) and different information is presented; the low power condition may not be initiated, but rather a different portion may be illuminated based upon the information to be displayed. During the display of the information in the user interface, the user may not be able to interact with the information or may be presented with limited options or functions.
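The display/re-cover loop of method 1100 can be sketched as an event loop over sampled contact counts. The contact-count mapping follows the single/double/triple example above; everything else (names, the sampling model) is an illustrative assumption rather than the patent's own implementation.

```python
# Hypothetical sketch of the method-1100 control flow: detect a contact,
# display matching information while the contact is maintained, and re-enter
# the low power condition when the contact is released. Events are simulated
# as a sequence of sampled contact counts (0 = no contact).

def run_peek_session(events):
    """Return the list of (state, info) transitions for a sequence of
    sampled contact counts."""
    transitions = []
    current = None
    for contacts in events:
        if contacts == 0:
            if current is not None:
                transitions.append(("low_power", None))  # NO at 1112: re-enter low power (1114)
                current = None
        elif contacts != current:
            # 1106-1110: determine and display information for the new input.
            info = {1: "time", 2: "recent_email", 3: "social_status"}.get(contacts, "time")
            transitions.append(("display", info))
            current = contacts
        # Unchanged contact (YES at 1112): keep displaying, no transition.
    return transitions
```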
- FIG. 12 shows a method 1200 of providing a peekable user interface in a covering apparatus based upon an application context as an expansion of method 1100 .
- the portable electronic device detects that it has been placed in a covering apparatus and enters a low power condition ( 1202 ).
- the type of covering apparatus is determined ( 1204 ), either based upon a predefined selection or by one or more sensors of the device which detect the type of case. For example, the user may pre-select that a sleeve case will always be used, such that the device may utilize a particular sensor, such as a light sensor, to determine when it is inserted in the case, or assume that it is in a case when it enters a locked or sleep mode.
- the device may detect the case by a magnetic sensor or by receiving a radio frequency identifier (RFID) which would identify the type of case when inserted.
- the device detects input such as a touch contact on the touch-sensitive display ( 1206 ) which may be by one or more finger contacts or by one or more capacitive elements provided in the covering apparatus. In order for the input to occur the device may have to be partially removed from the case or the case moved away from the display.
- the touch contact may be a single or multiple inputs depending on the case, or device configuration, such as a single, double, or triple finger contact.
- the position of the input within the display is determined ( 1208 ).
- the visible display area defined by the input position can then be determined ( 1210 ).
- a single input contact for a sleeve case would define a triangular display area from the top corners of the display, whereas two touch contacts would define a quadrilateral shaped display area.
- An application or program state of the device, such as a currently running application such as a media player, the last executed application such as an e-mail program, or a program that has recently generated an alert such as a text message or an incoming phone call, is determined ( 1212 ).
- information associated with the determined application state is determined ( 1214 ), of information types such as time, recent messages, application status information, or social networking status updates, for example.
- the amount of information may also be scaled based upon the display area visible from the input position.
- the determined information is then displayed ( 1216 ) on an illuminated portion of the visible portion of the display.
- the portion of the display that is not visible may not be illuminated, to conserve power resources. If the user maintains the contact with the display (YES at 1218 ), the information continues to be displayed; when contact changes and the input is not maintained (NO at 1218 ), the device re-enters the low power condition ( 1220 ). However, if a new position or a different touch contact is detected, the change is detected ( 1206 ) and different information is presented; the low power condition may not be initiated, but rather a different portion may be illuminated based upon the information to be displayed.
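The application-context step of method 1200, choosing information based on the device's program state and scaling it to the visible display area, might look like the following sketch. The state names, content items, and scaling rule are assumptions for illustration only.

```python
# Hypothetical sketch: select peek content from the application state
# (1212-1214) and scale the number of items to the visible display area
# defined by the input position (1210, 1216).

def peek_content_for_state(app_state: str, visible_area_fraction: float):
    """Return the list of content items to display for the given application
    state, truncated to fit the visible fraction of the display."""
    content = {
        "media_player": ["now playing", "playlist"],
        "email": ["sender 1", "sender 2", "sender 3"],
        "incoming_call": ["caller id"],
    }.get(app_state, ["time"])  # default: just show the time
    # Scale the number of items to the uncovered portion of the display.
    max_items = max(1, int(len(content) * visible_area_fraction))
    return content[:max_items]
```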
- the information that may be displayed while being peeked at may be of information types that a user may want to know about but may not necessarily want to interact with.
- the information types may include, but are not limited to, date and time, currently running application information, received messages, missed calls, alerts, progress information, state information, social networking status information, text or instant messaging information, navigation or location information, or media playback state or playlist information.
- Each type of information may be determined for display by one or more parameters such as the display area visible, position of the touch contacts, the order that the touch contacts are received, the gesture or meta-navigation gesture associated with the touch contact, orientation of the device, the type of covering apparatus or operational state of the portable electronic device.
- FIG. 13 shows a block diagram of a portable electronic device 100 / 600 in accordance with an example embodiment.
- a processor 1302 , which may be a multiple core processor or multiple processors, may interface with components or modules of the device to provide the required functionality.
- a touch-sensitive display 1318 is coupled to the processor 1302 .
- the touch-sensitive display 1318 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW) touch-sensitive display, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, and so forth, as known in the art.
- the touch-sensitive display 1318 is a capacitive touch-sensitive display which includes a capacitive touch-sensitive overlay 1314 .
- the overlay 1314 may be an assembly of multiple layers in a stack which may include, for example, a substrate, a ground shield layer, a barrier layer, one or more capacitive touch sensor layers separated by a substrate or other barrier, and a cover.
- the capacitive touch sensor layers may be any suitable material, such as patterned indium tin oxide (ITO).
- the portable electronic device 100 / 600 is maintained in a low-power condition, for example, by displaying no information on the display 1312 of the touch-sensitive display 1318 , e.g., the display 1312 is blank or black with no pixels illuminated.
- the processing activities of the device 100 / 600 are typically significantly reduced during a low-power condition.
- Minimal touch sensing is active on the touch-sensitive display 1318 , such that power usage is minimal. For example, scanning of touch sensors may take place every 100 to 500 ms or at a reduced rate from active touch sensing when in low-power condition.
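- The reduced scan rate described above may be sketched as follows. The 100-500 ms low-power range comes from the paragraph above; the specific 250 ms value and the 60 Hz active rate are illustrative assumptions, not values from the disclosure.

```python
def scan_interval_ms(low_power: bool, active_rate_hz: int = 60) -> int:
    # In the low-power condition, touch sensors may be scanned only
    # every 100-500 ms; when active, scanning runs at the full rate.
    if low_power:
        return 250  # assumed value within the 100-500 ms range above
    return 1000 // active_rate_hz  # ~16 ms at an assumed 60 Hz
```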
- the input may be a gesture such as a simple touch or a touch that moves, a contact with the screen, or contact of capacitive elements or sensors for detecting a covering apparatus position.
- the gesture may be simple or complex.
- the gesture may be a swipe that moves in a single direction along the display or a touch that hovers or is maintained at or near the same location. Any other gesture may be utilized.
- the gesture may begin anywhere on the touch-sensitive display 1318 , although advantage may be gained, for example, by detecting a touch starting at any edge of the display, such as the bottom of the display or a corner of the display.
- the gesture may be a series or sequence of taps on the touch-sensitive display 1318 . The location of the taps may or may not be relevant to detecting the gesture.
- the processor 1302 interfaces with memory 1310 providing an operating system 1346 and programs or applications 1348 providing instructions for execution by the processor 1302 .
- Random access memory 1308 is provided for the execution of the instructions and for processing data to be sent to or received from various components of the device.
- Various input/output devices or sensors may be provided such as an accelerometer 1336 , light sensor 1338 , magnetic sensor 1340 such as a Hall effect sensor, and one or more cameras 1342 which may be used for detection of a covering apparatus type or the presence or position of the covering apparatus.
- a communication subsystem 1304 is provided for enabling data to be sent or received with a local area network 1350 or wide area network utilizing different physical layer and access technology implementations.
- a subscriber identity module or removable user identity module 1362 may be provided depending on the requirement of the particular network access technology to provide user access or identity information.
- Short-range communications 1332 may also be provided and may include near-field communication (NFC), radio frequency identifier (RFID), or Bluetooth technologies.
- the device may also be provided with a data port 1326 and auxiliary input/output interface for sending and receiving data.
- a microphone 1330 and speaker 1328 may also be provided to enable audio communications via the device 100 .
- the display 1312 of the touch-sensitive display 1318 includes a display area in which information may be displayed, and a non-display area extending around the periphery of the display area. Information is not displayed in the non-display area, which is utilized to accommodate, for example, electronic traces or electrical connections, adhesives or other sealants, and/or protective coatings around the edges of the display area.
- One or more touches may be detected by the touch-sensitive display 1318 .
- the processor 1302 may determine attributes of the touch, including a location of a touch.
- Touch location data may include an area of contact or a single point of contact, such as a point at or near a center of the area of contact, known as the centroid.
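- Reducing an area of contact to its centroid, as described above, may be sketched as a simple average of the sensed points. The function name and point representation are assumptions for illustration.

```python
def centroid(points):
    # Reduce an area of contact (a list of sensed (x, y) points) to a
    # single point at or near the center of the area, as described above.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```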
- a signal is provided to the controller 1316 in response to detection of a touch.
- a touch may be detected from any suitable object, such as a finger, thumb, appendage, or other items, for example, a stylus, pen, or other pointers, depending on the nature of the touch-sensitive display 1318 .
- the location of the touch moves as the detected object moves during a touch.
- the controller 1316 and/or the processor 1302 may detect a touch by any suitable contact member on the touch-sensitive display 1318 . Similarly, multiple simultaneous touches are detected.
- One or more gestures are also detected by the touch-sensitive display 1318 .
- a gesture is a particular type of touch on a touch-sensitive display 1318 that begins at an origin point and continues to an end point.
- a gesture may be identified by attributes of the gesture, including the origin point, the end point, the distance travelled, the duration, the velocity, and the direction, for example.
- a gesture may be long or short in distance and/or duration. Two points of the gesture may be utilized to determine a direction of the gesture.
- a swipe (also known as a flick) has a single direction.
- the touch-sensitive overlay 1314 may evaluate swipes with respect to the origin point at which contact is initially made with the touch-sensitive overlay 1314 and the end point at which contact with the touch-sensitive overlay 1314 ends, rather than using each location or point of contact over the duration of the gesture to resolve a direction.
- swipes include a horizontal swipe, a vertical swipe, and a diagonal swipe.
- a horizontal swipe typically comprises an origin point towards the left or right side of the touch-sensitive overlay 1314 to initialize the gesture, a horizontal movement of the detected object from the origin point to an end point towards the right or left side of the touch-sensitive overlay 1314 while maintaining continuous contact with the touch-sensitive overlay 1314 , and a breaking of contact with the touch-sensitive overlay 1314 .
- a vertical swipe typically comprises an origin point towards the top or bottom of the touch-sensitive overlay 1314 to initialize the gesture, a vertical movement of the detected object from the origin point to an end point towards the bottom or top of the touch-sensitive overlay 1314 while maintaining continuous contact with the touch-sensitive overlay 1314 , and a breaking of contact with the touch-sensitive overlay 1314 .
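- Resolving a swipe's direction from only its origin and end points, as described above, may be sketched as follows. The coordinate convention (y increasing downward, as is typical for display coordinates) and the function name are assumptions.

```python
def swipe_direction(origin, end):
    # Classify a swipe as horizontal or vertical using only the origin
    # and end points rather than every intermediate contact point.
    dx, dy = end[0] - origin[0], end[1] - origin[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"  # horizontal swipe
    return "down" if dy > 0 else "up"         # vertical swipe
```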
- Swipes can be of various lengths, can be initiated in various places on the touch-sensitive overlay 1314 , and need not span the full dimension of the touch-sensitive overlay 1314 .
- breaking contact of a swipe can be gradual in that contact with the touch-sensitive overlay 1314 is gradually reduced while the swipe is still underway.
- Meta-navigation gestures may also be detected by the touch-sensitive overlay 1314 .
- a meta-navigation gesture is a gesture that has an origin point that is outside the display area of the touch-sensitive overlay 1314 and that moves to a position on the display area of the touch-sensitive display. Other attributes of the gesture may be detected and be utilized to detect the meta-navigation gesture.
- Meta-navigation gestures may also include multi-touch gestures in which gestures are simultaneous or overlap in time and at least one of the touches has an origin point that is outside the display area and moves to a position on the display area of the touch-sensitive overlay 1314 . Thus, two fingers may be utilized for meta-navigation gestures.
- multi-touch meta-navigation gestures may be distinguished from single touch meta-navigation gestures and may provide additional or further functionality, or be used to distinguish between types of information that the user may require to be displayed on the device, which would also be dependent on the display area available for display.
- an optional force sensor 1322 or force sensors is disposed in any suitable location, for example, between the touch-sensitive display 1318 and a back of the portable electronic device 100 to detect a force imparted by a touch on the touch-sensitive display 1318 .
- the force sensor 1322 may be a force-sensitive resistor, strain gauge, piezoelectric or piezoresistive device, pressure sensor, or other suitable device.
- Force as utilized throughout the specification refers to force measurements, estimates, and/or calculations, such as pressure, deformation, stress, strain, force density, force-area relationships, thrust, torque, and other effects that include force or related quantities.
- Force information related to a detected touch may be utilized to select information, such as information associated with a location of a touch. For example, a touch that does not meet a force threshold may highlight a selection option, whereas a touch that meets a force threshold may select or input that selection option. Different magnitudes of force may be associated with different functions or input. For example, a lesser force may result in displaying only time information in the user interface information, and a higher force may result in the display of e-mail information in the user interface information.
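- The force-dependent behavior described above may be sketched as threshold comparisons. The threshold value (in newtons) and the function names are assumptions; the disclosure only states that different magnitudes of force may map to different functions or information.

```python
def touch_action(force_n, threshold_n=1.5):
    # Below the (assumed) threshold a touch highlights a selection
    # option; meeting the threshold selects or inputs that option.
    return "select" if force_n >= threshold_n else "highlight"

def info_for_force(force_n, threshold_n=1.5):
    # A lesser force shows only time information; a higher force shows
    # e-mail information, per the example above.
    return "email" if force_n >= threshold_n else "time"
```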
- FIG. 14 is a front view of an example of a portable electronic device 100 .
- the portable electronic device 100 includes a housing 1402 that encloses components such as shown in FIG. 13 .
- the housing 1402 may include a back, sidewalls, and a front 1404 that frames the touch-sensitive display 1318 .
- the touch-sensitive display 1318 is generally centered in the housing 1402 such that a display area 1406 of the touch-sensitive overlay 1314 is generally centered with respect to the front 1404 of the housing 1402 .
- the non-display area 1408 of the touch-sensitive overlay 1314 extends around the display area 1406 .
- the width of the non-display area is 4 mm.
- the touch-sensitive overlay 1314 extends to cover the display area 1406 and the non-display area 1408 .
- Touches on the display area 1406 may be detected and, for example, may be associated with displayed selectable features.
- Touches on the non-display area 1408 may be detected, for example, to detect a meta-navigation gesture.
- meta-navigation gestures may be determined by both the non-display area 1408 and the display area 1406 .
- the density of touch sensors may differ from the display area 1406 to the non-display area 1408 .
- the density of nodes in a mutual capacitive touch-sensitive display, or density of locations at which electrodes of one layer cross over electrodes of another layer may differ between the display area 1406 and the non-display area 1408 .
- Gestures received on the touch-sensitive display 1318 may be analyzed based on the attributes to discriminate between meta-navigation gestures and other touches, or non-meta navigation gestures.
- Meta-navigation gestures may be identified when the gesture crosses over a boundary near a periphery of the display 1312 , such as a boundary 1410 between the display area 1406 and the non-display area 1408 .
- the origin point of a meta-navigation gesture may be determined utilizing the area of the touch-sensitive overlay 1314 that covers the non-display area 1408 .
- a buffer region 1412 or band that extends around the boundary 1410 between the display area 1406 and the non-display area 1408 may be utilized such that a meta-navigation gesture is identified when a touch has an origin point outside the boundary 1410 and the buffer region 1412 and crosses through the buffer region 1412 and over the boundary 1410 to a point inside the boundary 1410 .
- the buffer region 1412 may not be visible. Instead, the buffer region 1412 may be a region around the boundary 1410 that extends a width that is equivalent to a predetermined number of pixels, for example.
- the boundary 1410 may extend a predetermined number of touch sensors or may extend a predetermined distance from the display area 1406 .
- the boundary 1410 may be a touch-sensitive region or may be a region in which touches are not detected.
- Gestures that have an origin point in the buffer region 1412 may be identified as non-meta navigation gestures.
- data from such gestures may be utilized by an application as a non-meta navigation gesture.
- data from such gestures may be discarded such that touches that have an origin point on the buffer region 1412 are not utilized as input at the portable electronic device 100 .
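- The buffer-region rule described above may be sketched as follows: a gesture whose origin lies outside both the boundary and the buffer region and whose end point lies inside the display area is a meta-navigation gesture, while an origin in the buffer region yields a non-meta-navigation gesture. The rectangle representation and buffer width are assumptions for illustration.

```python
def classify_gesture(origin, end, display_rect, buffer_px=8):
    # display_rect = (left, top, right, bottom) of the display area;
    # buffer_px is an assumed buffer-region width around the boundary.
    l, t, r, b = display_rect

    def inside(p, margin=0):
        return (l - margin <= p[0] <= r + margin and
                t - margin <= p[1] <= b + margin)

    if inside(origin, buffer_px) and not inside(origin):
        # Origin in the buffer region: identified as non-meta-navigation.
        return "non-meta-navigation"
    if not inside(origin, buffer_px) and inside(end):
        # Origin outside the boundary and the buffer region, path ends
        # inside the boundary: identified as meta-navigation.
        return "meta-navigation"
    return "non-meta-navigation"
```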
- FIG. 15 illustrates examples of touches on the portable electronic device of FIG. 14 .
- FIG. 15 illustrates examples of touches on the touch-sensitive display 1318 .
- the buffer region 1412 is illustrated in FIG. 15 by hash markings for the purpose of explanation. As indicated, the buffer region 1412 may not be visible to the user.
- touches are illustrated by circles at their points of origin. Arrows extending from the circles illustrate the paths of the touches that are gestures.
- the touch 1502 begins at the origin point outside the boundary 1410 and outside the buffer region 1412 .
- the path of the touch 1502 crosses the buffer region 1412 and the boundary 1410 and is therefore identified as a meta-navigation gesture.
- the touches 1506 , 1508 , and 1510 each have origin points outside the boundary 1410 and the buffer region 1412 and their paths cross the buffer region 1412 and the boundary 1410 .
- Each of the touches 1506 , 1508 and 1510 is therefore identified as a meta-navigation gesture.
- a single touch contact 1514 may also be provided without a vector or motion, for example by the user moving the covering apparatus away from the device and then touching a point on the display area 1406 .
- the touch contact 1504 and 1508 may be defined as a vector having a start point and an end point within the display area.
- the combined inputs would be used to define a display area relative to an outer edge of the device when the touch contact is held, defining when the covering apparatus is being peeked.
- a touch contact 1512 that does not enter the display area 1406 may not activate the display of the user interface, as no definable area would be visible.
- touch contact 1512 may be combined with another touch contact, such as touch contact 1502 , to define a displayable area for user interface information.
- the touch contact 1516 may also be defined as a region extending across the surface of the display, such as provided by a capacitive element provided within the case.
- the display area may be defined from the position of the touch contact 1506 parallel to an edge of the device 1402 and may be defined in a landscape or portrait mode depending on the aspect ratios of the device 1402 .
- Although the device is described as having a touch-sensitive non-display area, the device may also not have a touch-sensitive non-display area in embodiments of the present disclosure.
- Although capacitive display technologies are described in the above examples, other types of display sensing technologies may be utilized to identify a gesture or movement of the portable electronic device within the case.
- any suitable computer readable memory can be used for storing instructions for performing the processes described herein.
- computer readable media can be transitory or non-transitory.
- non-transitory computer readable media can include non-volatile computer storage memory or media such as magnetic media (such as hard disks), optical media (such as compact discs, digital video discs, Blu-ray discs, etc.), semiconductor media (such as flash memory, read only memory (ROM), electrically programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), etc.), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media.
- transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, circuits, and any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.
Description
- The present disclosure relates generally to displaying user interface information on a portable electronic device.
- Touch-sensitive displays, or touch screens, have become more prevalent in portable electronic devices, providing an information display and interaction interface. Touch-sensitive displays enable users to interact with the device using numerous interaction points rather than a fixed binary button configuration. However, the portable electronic device can be deactivated in a sleep mode or locked by the user, when placed in a protective case, or when the display is covered, requiring the user to remove or uncover the display and perform an unlock or wake gesture in order to access content.
- Accordingly, there is a need for effectively viewing information on a portable electronic device.
- Further features and advantages of the present disclosure will become apparent from the following detailed description, taken in combination with the appended drawings, in which:
-
FIG. 1 shows a representation of portable electronic device and a sleeve type covering apparatus; -
FIG. 2 shows a representation of the peekable user interface when the portable electronic device is in the covering apparatus using a single finger gesture; -
FIG. 3 shows another representation of the peekable user interface when the portable electronic device is in the covering apparatus using a single finger gesture; -
FIG. 4 shows a representation of the peekable user interface when the portable electronic device is in a covering apparatus using a two finger gesture; -
FIG. 5 shows a representation of the peekable user interface when the device is not in the covering apparatus using a single finger gesture; -
FIG. 6 shows a representation of a portable electronic device and a pocket pouch type covering apparatus; -
FIG. 7 shows a representation of the portable electronic device in the pocket pouch covering apparatus; -
FIG. 8 shows a representation of the portable electronic device in the pocket pouch covering apparatus in a first display position; -
FIG. 9 shows a representation of the portable electronic device in the pocket pouch covering apparatus in a second display position; -
FIG. 10 shows a method of displaying a user interface when inserted in a covering apparatus; -
FIG. 11 shows another method of displaying a user interface when inserted in a covering apparatus; -
FIG. 12 shows a method of displaying a user interface based upon an application context; -
FIG. 13 shows a block diagram of a portable electronic device in accordance with an example embodiment; -
FIG. 14 shows a front view of an example of a portable electronic device; and -
FIG. 15 shows examples of touches on the portable electronic device of FIG. 14 . - It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
- In accordance with an aspect of the present disclosure there is provided a method of displaying a user interface on a portable electronic device, the method comprising: detecting an input corresponding to displacement of a covering apparatus, the displacement uncovering a portion of a display of the portable electronic device, the display being in a low power condition; and illuminating at least the uncovered portion of the display and displaying the user interface, the user interface presenting information that is determined at least in part by the extent of the displacement.
- In accordance with another aspect of the present disclosure there is provided a portable electronic device comprising a touch-sensitive display; a processor coupled to the touch-sensitive display; a memory coupled to the processor containing instructions which when executed by the processor perform: detecting an input corresponding to displacement of a covering apparatus, the displacement uncovering a portion of the touch-sensitive display of the portable electronic device, the touch-sensitive display being in a low power condition; and illuminating at least the uncovered portion of the touch-sensitive display and displaying the user interface, the user interface presenting information that is determined at least in part by the extent of the displacement.
- In accordance with yet another aspect of the present disclosure there is provided a computer readable memory containing instructions for presenting a user interface on a portable electronic device, the instructions which when executed by a processor perform the method comprising: detecting an input corresponding to displacement of a covering apparatus, the displacement uncovering a portion of a display of the portable electronic device, the display being in a low power condition; and illuminating at least the uncovered portion of the display and displaying the user interface, the user interface presenting information that is determined at least in part by the extent of the displacement.
- Although the following description discloses example methods and apparatus including, among other components, software executed on hardware, it should be noted that such methods and apparatus are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware and software components could be embodied exclusively in hardware, exclusively in software, exclusively in firmware, or in any combination of hardware, software, and/or firmware. Accordingly, while the following describes example methods and apparatus, persons having ordinary skill in the art will readily appreciate that the examples provided are not the only way to implement such methods and apparatus.
- It will be appreciated that for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the embodiments described herein. Also, the description is not to be considered as limiting the scope of the embodiments described herein. Embodiments are described below, by way of example only, with reference to
FIGS. 1-15 . - When a user inserts a portable electronic device in a covering apparatus such as a sleeve or a pouch, or covers the display of the device with a cover, the device is typically placed into a low power condition and locked for security or to prevent inadvertent interaction with the device. However, there are tasks such as checking the time, checking if new e-mail has arrived, social network status changes, or media playback status where the user only needs to momentarily access information on the device, and performing an unlock and then navigating to a particular application to retrieve information can be cumbersome and inconvenient. The present disclosure provides a user interface (UI) to present information commonly accessed by the user for brief segments, which is contextual to the display of the device being partially uncovered and is activated by an input from the covering apparatus, or a touch contact, interface gesture, or meta-navigation gesture. The portable electronic device detects that it is in a covering apparatus, for example using a magnetic case Hall effect sensor that activates holster events; when in the covering apparatus, the device detects input to activate a special UI in "case mode" and illuminate the display. The UI displays a simplified view of the type of information a user would want to access quickly without having to remove the device from the case (time, last message received, etc.). When the top of the case is stretched down (performing a gesture) to "peek" at the display, or when the device is pulled out of a case, or a cover lifted, a special UI is shown giving a limited and more targeted display of information based upon a touch contact, interface gesture, meta-navigation gesture, or movement of the device or cover to expose a portion of the display. Different information may be presented depending on an application being executed on the device, such as a media player.
In addition, different types of display contact would display different types of information depending on the amount of display area that is visible from the case. For example, "peeking" at the display with one finger could show the time, while two fingers could display further information such as a message list, or a media playlist if audio is being played back on the device, with the assumption that more display area would be visible. The use of different covering apparatus designs such as sleeve cases, pouch cases, removable screen covers, or folio designs could present the information in a different manner. For example, a foldable or articulating case or display cover may display different amounts of information based upon the portion of the cover that has been folded back, and the orientation of the case, portrait vs. landscape, would define how and where the display area would be visible to the user when 'peeking' at the display.
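- The contact-count examples above may be sketched as a simple mapping. The function name, return labels, and the media-playback parameter are assumptions chosen to mirror the one-finger/two-finger examples in the text.

```python
def info_for_contacts(num_contacts, playing_media=False):
    # One finger shows the time; two or more fingers show a message
    # list, or a media playlist if audio is being played back.
    if num_contacts == 1:
        return "time"
    if num_contacts >= 2:
        return "playlist" if playing_media else "message_list"
    return None  # no contact: nothing is displayed
```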
-
FIG. 1 shows a representation of portable electronic device 100 and a covering apparatus 110 . In this example the portable electronic device 100 is a tablet type form factor and the covering apparatus is a sleeve type case made of a flexible material in which the portable electronic device 100 is inserted through a lengthwise opening 112 . The touch-sensitive display 102 of the portable electronic device 100 is covered by the covering apparatus 110 material when placed inside. -
FIG. 2 shows a representation of the peekable user interface when the portable electronic device is in the covering apparatus using a single finger gesture. When the portable electronic device 100 is placed in the covering apparatus sleeve case 110 , the portable electronic device 100 is in a sleep, low power condition or locked state until some form of user interaction occurs, at which point an unlock screen would be presented. The state of the portable electronic device 100 may be based upon an action performed by the user prior to placing the portable electronic device 100 in the case, such as locking the device or placing the device in a sleep mode, or by the device detecting the insertion into a case 110 by one or more sensors on the portable electronic device 100 . Once the portable electronic device 100 is inserted in the case, the top portion of the opening of the case 110 on top of the display 102 can be moved by the user finger 210 . The portable electronic device 100 senses an input such as a touch contact, which may be part of a gesture on the touch-sensitive display 102 , while within the case, and illuminates the display to display content or information 202 based upon the input in the uncovered portion of the touch-sensitive display 102 . In this example a single finger 210 touch contact would present information of a first information type such as the time 202 , which would easily fit within the displayable screen area. The portable electronic device 100 can display content based upon the amount of display area that would be visible based upon the type of case 110 and the position of the touch contact or gesture received. When the touch contact is removed the touch-sensitive display 102 returns to the low power condition. The covering apparatus is made of a material with sufficient insulating properties that does not allow user touch contact to be conveyed through the case itself, to ensure that false touch contacts are not generated. -
FIG. 3 is similar to FIG. 2 in that a single finger gesture is performed on the touch-sensitive display 102 of the device; however the display is resized to account for the larger portion of display area 302 that is shown, but the same information is shown. The information itself may be resized or scaled based upon the determined display room available. The device may only render or activate the display area that would be visible based upon the case type. -
FIG. 4 shows a representation of the peekable user interface when the portable electronic device is in a covering apparatus using a two finger touch contact. In this example a two finger touch contact 410 412 is performed by the user to pull down the covering apparatus from the display. As the device senses multiple touch contacts and a larger display area is visible, and illuminated, more or different information of a second information type is displayed in the UI 402 , such as a list of e-mail messages received on the device 100 . The user may not be able to interact with the content, as the device may be locked and can only show certain content items. Alternatively, the content that is displayed based upon the touch contact may be application dependent, such as showing a media playlist if the device is playing media content. In addition, gestures or meta-navigation gestures associated with the touch contact may be utilized to determine the information to be shown in the UI. For example, if the left finger 412 is lower on the display than the right finger 410 , the device may show social networking information, whereas if the fingers 410 412 are at approximately the same level, e-mail information may be displayed; or if the touch contact is associated with a direction of a swipe on the display, different information may be presented based upon the direction. -
FIG. 5 shows a representation of the peekable user interface when the device is not in the covering apparatus, using a single finger touch contact as an example of the rendering of the display that would occur when within the case. As shown, a single finger touch contact point defines a display region based upon the type of case, in this example a sleeve type case. Only a portion of the display 102 is illuminated or activated 502 based upon a single finger 510 touch contact input, as would be defined by the movement of the material of the case. Other portions of the display 102 , such as on the left 520 and right 522 sides of the active display portion, may remain inactive or not display any content to conserve power resources. Similarly, a two finger touch contact input may occur, to define a quadrilateral display area based upon the touch contact position relative to an orientation of one of the display edges. The display area may be defined based upon the touch contact position relative to a top edge of the display, or based on a gesture input defining the starting positions of the gesture and the end positions of the gesture creating the quadrilateral display area. In the case of a touch-sensitive non-display area, as will be described in regards to FIGS. 14 and 15 and meta-navigation gestures, the contact points on the non-display area would be used to identify the start position, or top, of the quadrilateral display area. -
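- Deriving an illuminated region from the touch contacts, as described above, may be sketched as follows. The geometry here is an assumption for a sleeve-type case pulled down from the top edge: the lit region spans from the top of the display down to the lowest contact point. Function name and tuple layout are illustrative.

```python
def peek_region(contacts, display_w, display_h):
    # contacts: list of (x, y) touch points, y increasing downward.
    # Assumed sleeve geometry: the uncovered (and therefore illuminated)
    # region runs from the top edge down to the lowest contact; the
    # remainder of the display stays dark to conserve power.
    lowest_y = max(y for _, y in contacts)
    lowest_y = min(lowest_y, display_h)  # clamp to the display bounds
    return (0, 0, display_w, lowest_y)   # (left, top, right, bottom)
```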
FIG. 6 shows a representation of a portable electronic device 600 and a pocket pouch type covering apparatus 610 . In this example the portable electronic device 600 is of a small form factor such as a smart phone or mobile device having a touch-sensitive display 602 . A pocket pouch type case 610 is shown having a capacitive element 612 embedded within the case which will contact the touch-sensitive display 602 of the portable electronic device 600 , as shown in FIG. 7 , to provide the input to determine the displacement of the display 602 . When the portable electronic device 600 is fully inserted in the case 610 , the capacitive element may not contact a touch-sensitive portion of the display but be positioned above the display 602 on an inactive portion of the device 600 . The pocket pouch type covering apparatus may also be provided with a strap to cover over the top of the device 600 . -
FIG. 8 shows a representation of the portable electronic device 600 in the pocket pouch covering apparatus 610 in a first display position 802. In this example the device 600 is displaced partially out of the covering apparatus 610. As the device 600 is only partially in the case, the touch-sensitive display 602 determines the relative position of the capacitive element 612 along its length, giving a first position 802 of the touch-sensitive display 602 and the portion of visible display area. Information to be displayed in the user interface is determined for a first information type, such as time information 804, and the display is illuminated with the information. If the capacitive element 612 moves further along the touch-sensitive display 602 surface to a second display position 902, alternate or additional information of a second information type can be displayed 904, such as e-mail items as shown in FIG. 9, or other changeable status information or content such as social network status updates, text messages, phone messages, weather, media player status, or other updatable content. When more of the display is displaced, additional area is illuminated to show the second information type. Although this example is shown using a capacitive element 612 in the covering apparatus 610, a finger touch contact or gesture input may also be utilized to determine the information for display when the device is inserted in the case. The distance of the finger contact input along the length of the display, rather than a number of fingers sensed, may be used due to the smaller display 602 size. Alternatively, for a portable electronic device that is a slider type device, where the keyboard is accessible by sliding the display of the device upwardly, the information displayed may be determined by the distance by which the keyboard is exposed, without requiring an unlock to be performed to view the information.
For example, if the keyboard is partially visible, that exposure defines an input which in turn defines a display area: the time may be shown on the display while it is exposed, and if more of the keyboard is visible other content, such as recent e-mails, may be shown. The relative position of the display over the keyboard defines the input, so that different content may be displayed on the display of the device until the device reaches a fully visible position or an unlock is performed. -
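As a rough sketch of the behaviour described above, the amount of the display (or keyboard) that is exposed can be mapped to an information type. The thresholds and content names below are illustrative assumptions, not values taken from this disclosure:

```python
def peek_content(exposed_fraction: float) -> str:
    """Map how much of the device is uncovered to an information type."""
    if exposed_fraction <= 0.0:
        return "covered"        # fully covered: remain in low power condition
    if exposed_fraction < 0.3:
        return "time"           # small peek: show only the time
    if exposed_fraction < 1.0:
        return "recent_email"   # larger peek: show recent e-mail items
    return "unlocked_ui"        # fully exposed: normal interface (or unlock)
```

A sliding cover thus naturally steps through the information types as it is displaced, without any unlock being performed.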
FIG. 10 shows a method 1000 of providing a peekable user interface on a portable electronic device. The portable electronic device is inserted into, or covered by, the covering apparatus and enters a low power condition (1002). When the display is covered, by a cover or by insertion into a case, the device is placed in a sleep mode by the user, or by the portable electronic device detecting the insertion by a sensor, such as a magnetic sensor, or by changes in conditions around the portable electronic device such as light. The device then detects an input caused by displacement of the covering apparatus. The input may be detected from movement of the covering apparatus, by a touch input or contact on the display, or by an input sensor of the portable electronic device, while a portion of the display is visible and a remaining portion is covered by the covering apparatus (1004). To enable contact with the display, the device or the case must be moved so that a portion of the display is uncovered and visible. For example, in a covering apparatus such as a sleeve case a portion of the case may be slid downward by the user, while in a pouch type case the device may be pulled upward to expose a portion of the display, and a foldable segmented cover may be partially folded away from the display. The visible or displaced portion of the display is then illuminated, exiting the low power condition, and the user interface is displayed on the touch-sensitive display while the input is detected; the user interface information is determined based on a position of the input on the display (1006), enabling the user to readily access peekable information without unlocking the device, completely uncovering it, or removing it from the covering apparatus. If the input is removed by re-covering the device with the protective apparatus, the device re-enters the low power condition.
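The steps of method 1000 can be sketched as a small decision function. This is a hypothetical simplification, assuming the contact position is reduced to a single y coordinate measured from the top edge of the display:

```python
def peekable_ui_step(covered: bool, input_position):
    """One pass of method 1000: returns (state, region_to_illuminate).

    input_position is None when no input is detected, or the y coordinate
    (pixels from the top edge) where the cover edge or finger contact sits.
    """
    if covered and input_position is None:
        return ("low_power", None)          # 1002: covered and idle, sleep
    if covered and input_position is not None:
        # 1004/1006: a portion is visible; illuminate only the rows above
        # the contact and determine the user interface for that region
        return ("peek", ("rows_0_to", input_position))
    return ("active", "full_display")       # not covered: normal operation
```

Re-covering the device removes the input, so the next pass returns to the low power state.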
In a foldable or articulating cover type covering apparatus, multiple capacitive elements may be embedded in the cover, and the device detects the number of touch contacts; for example, two out of three cover segments covering the display indicates that only a third of the display is visible. -
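For the segmented-cover case above, the number of cover-segment contacts detected maps directly to the visible fraction of the display. A minimal sketch, assuming each segment covers an equal share and the three-segment count from the example:

```python
def visible_fraction(segments_covering: int, total_segments: int = 3) -> float:
    """Fraction of the display left visible under a segmented folding cover,
    inferred from how many cover-segment capacitive contacts are detected."""
    return 1.0 - segments_covering / total_segments
```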
FIG. 11 shows a method 1100 of providing a peekable user interface as an expansion of method 1000. The portable electronic device detects that it is covered by a covering apparatus and enters a low power condition (1102). The device detects an input such as a touch contact on a portion of the touch-sensitive display (1104), which may be by a user finger, by a capacitive element provided in the case, or by an input sensor. In order for the input to be detected, the device may have to be partially removed from the case, or the cover moved away from the display, making a portion of the display visible. The touch contact may be a single or multiple inputs depending on the case or device configuration, such as a single, double, or triple finger contact. The position of the input or contact(s) within the display is determined (1106). The information for display is determined based upon the input (1108). For example, a single contact may be for displaying the time, while a double contact may be for displaying recent e-mail, or a triple input for displaying social network status updates. The information may also be based upon the amount of display area determined to be visible from the touch contact, given the case type of the covering apparatus. For example, in a sleeve type case a single touch contact may only expose a triangular display portion, while in a pouch type covering apparatus a single contact may expose a significant portion of the display as the device is slid out of the pouch. The determined information is then displayed by illuminating at least the portion of the display that is visible (1110). The information may be formatted to fit within the viewable display area. If the user maintains the contact with the display (YES at 1112), the information continues to be displayed; when the contact changes and the input is not maintained (NO at 1112), the device re-enters the low power condition (1114).
However, if a new position or a different touch contact is detected, the change is detected (1104) and different information is presented; the low power condition may not be initiated, and instead a different portion may be illuminated based upon the information to be displayed. During the display of the information in the user interface the user may not be able to interact with the information, or may be presented with limited options or functions. -
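The contact-count branch of method 1100 (1104 through 1110) amounts to a lookup from the number of simultaneous touch contacts to an information type. The mapping below mirrors the single/double/triple example in the text; the content names are illustrative:

```python
# Example mapping from method 1100: number of touch contacts -> information type
INFO_BY_CONTACT_COUNT = {
    1: "time",                   # single contact: display the time
    2: "recent_email",           # double contact: display recent e-mail
    3: "social_status_updates",  # triple contact: social network updates
}

def peek_info(contact_count: int) -> str:
    """Select the information type for the peek user interface (1108)."""
    return INFO_BY_CONTACT_COUNT.get(contact_count, "none")
```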
FIG. 12 shows a method 1200 of providing a peekable user interface in a covering apparatus based upon an application context, as an expansion of method 1100. The portable electronic device detects that it has been placed in a covering apparatus and enters a low power condition (1202). The type of covering apparatus is determined (1204), either based upon a predefined selection or by one or more sensors of the device which detect the type of case. For example, the user may pre-select that a sleeve case will always be used, such that the device may utilize a particular sensor, such as a light sensor, to determine when it is inserted in the case, or assume that it is in a case when it enters a locked or sleep mode. Alternatively, the device may detect the case by a magnetic sensor or by receiving a radio frequency identifier (RFID) which identifies the type of case when inserted. The device detects an input such as a touch contact on the touch-sensitive display (1206), which may be by one or more finger contacts or by one or more capacitive elements provided in the covering apparatus. In order for the input to occur the device may have to be partially removed from the case, or the case moved away from the display. The touch contact may be a single or multiple inputs depending on the case or device configuration, such as a single, double, or triple finger contact. The position of the input within the display is determined (1208). The visible display area defined by the input position can then be determined (1210). For example, a single input contact for a sleeve case would define a triangular display area from the top corners of the display, whereas two touch contacts would define a quadrilateral shaped display area.
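Step 1210 can be sketched as an area estimate from the contact positions. The geometry below is an assumption for illustration: a single sleeve-case contact defines a triangle from the two top corners down to the contact point, and two contacts are approximated as the rectangle from the top edge down to the lower contact:

```python
def visible_area(case_type: str, contacts, width: float) -> float:
    """Rough area (px^2) of the display exposed by the given touch contacts.

    contacts is a list of (x, y) points, y measured from the top edge.
    """
    if case_type == "sleeve" and len(contacts) == 1:
        (_, y), = contacts
        return 0.5 * width * y              # triangular display area
    y_max = max(y for _, y in contacts)     # lower of the contact points
    return width * y_max                    # quadrilateral display area
```

Such an estimate can then drive how much information is shown, per step 1214.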
An application or program state of the device is determined (1212), such as a currently running application such as a media player, the last executed application such as an e-mail program, or a program that has recently generated an alert such as a text message or an incoming phone call. Depending on the preference configuration and on characteristics of the input, such as the gesture type, position, number of contacts, and/or display area shown, information associated with the determined application state is determined (1214), of information types such as the time, recent messages, application status information, or social networking status updates, for example. The amount of information may also be scaled based upon the display area made visible by the input position. The determined information is then displayed (1216) on an illuminated portion of the visible portion of the display. The portion of the display that is not visible may not be illuminated, to conserve power resources. If the user maintains the contact with the display (YES at 1218), the information continues to be displayed; when the contact changes and the input is not maintained (NO at 1218), the device re-enters the low power condition (1220). However, if a new position or a different touch contact is detected, the change is detected (1206) and different information is presented; the low power condition may not be initiated, and instead a different portion may be illuminated based upon the information to be displayed. - During the display of the information in the user interface, the user may not be able to interact with the content, as the device is locked or the display information would change. The information displayed while being peeked at may be of types that a user wants to know about but does not necessarily want to interact with.
For example, the information types may include, but are not limited to, date and time, currently running application information, received messages, missed calls, alerts, progress information, state information, social networking status information, text or instant messaging information, navigation or location information, or media playback state or playlist information. Each type of information may be selected for display by one or more parameters, such as the display area visible, the position of the touch contacts, the order in which the touch contacts are received, the gesture or meta-navigation gesture associated with the touch contact, the orientation of the device, the type of covering apparatus, or the operational state of the portable electronic device.
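The parameter-driven selection above can be sketched as a small priority dispatch. The rules, priorities, and parameter names here are illustrative assumptions, not part of the disclosure:

```python
def select_peek_info(params: dict) -> str:
    """Choose an information type from the peek parameters (example rules)."""
    if params.get("recent_alert"):
        return "alerts"                      # a fresh alert takes priority
    if params.get("visible_area_fraction", 0.0) < 0.25:
        return "date_and_time"               # tiny peek: just the clock
    if params.get("running_app") == "media_player":
        return "media_playback_state"        # application-context information
    return "received_messages"               # default for a larger peek
```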
-
FIG. 13 shows a block diagram of a portable electronic device 100/600 in accordance with an example embodiment. A processor 1302, which may be a multiple core processor or multiple processors, interfaces with the components or modules of the device to provide the required functionality. A touch-sensitive display 1318 is coupled to the processor 1302. The touch-sensitive display 1318 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW), strain gauge, optical imaging, dispersive signal technology, or acoustic pulse recognition touch-sensitive display, as known in the art. In the presently described example embodiment, the touch-sensitive display 1318 is a capacitive touch-sensitive display which includes a capacitive touch-sensitive overlay 1314. The overlay 1314 may be an assembly of multiple layers in a stack which may include, for example, a substrate, a ground shield layer, a barrier layer, one or more capacitive touch sensor layers separated by a substrate or other barrier, and a cover. The capacitive touch sensor layers may be any suitable material, such as patterned indium tin oxide (ITO). - The portable
electronic device 100/600 is maintained in a low-power condition, for example, by displaying no information on the display 1312 of the touch-sensitive display 1318, e.g., the display 1312 is blank or black with no pixels illuminated. The processing activities of the device 100/600 are typically significantly reduced during a low-power condition. Minimal touch sensing is active on the touch-sensitive display 1318, such that power usage is minimal. For example, scanning of the touch sensors may take place every 100 to 500 ms, a rate reduced from active touch sensing, when in the low-power condition. While the display 1312/touch-sensitive display 1318 is in the low-power condition, an input is detected on the touch-sensitive display 1318 or by one or more sensors of the portable electronic device 100/600, which at least minimally wakes up the device. The input may be a gesture such as a simple touch or a touch that moves, a contact with the screen, or contact of capacitive elements or sensors for detecting a covering apparatus position. The gesture may be simple or complex. For example, the gesture may be a swipe that moves in a single direction along the display, or a touch that hovers or is maintained at or near the same location. Any other gesture may be utilized. The gesture may begin anywhere on the touch-sensitive display 1318, although advantage may be gained, for example, by detecting a touch starting at any edge of the display, such as the bottom of the display or a corner of the display. The gesture may be a series or sequence of taps on the touch-sensitive display 1318. The location of the taps may or may not be relevant to detecting the gesture. - The
processor 1302 interfaces with memory 1310, which provides an operating system 1346 and programs or applications 1348 providing instructions for execution by the processor 1302. Random access memory 1308 is provided for the execution of the instructions and for processing data to be sent to or received from various components of the device. Various input/output devices or sensors may be provided, such as an accelerometer 1336, a light sensor 1338, a magnetic sensor 1340 such as a Hall effect sensor, and one or more cameras 1342, which may be used for detection of a covering apparatus type or the presence or position of the covering apparatus. A communication subsystem 1304 is provided for enabling data to be sent or received over a local area network 1350 or wide area network utilizing different physical layer and access technology implementations. A subscriber identity module or removable user identity module 1362 may be provided, depending on the requirements of the particular network access technology, to provide user access or identity information. Short-range communications 1332 may also be provided and may include near-field communication (NFC), radio frequency identifier (RFID), or Bluetooth technologies. The device may also be provided with a data port 1326 and an auxiliary input/output interface for sending and receiving data. A microphone 1330 and speaker 1328 may also be provided to enable audio communications via the device 100. - The display 1312 of the touch-
sensitive display 1318 includes a display area in which information may be displayed, and a non-display area extending around the periphery of the display area. Information is not displayed in the non-display area, which is utilized to accommodate, for example, electronic traces or electrical connections, adhesives or other sealants, and/or protective coatings around the edges of the display area. - One or more touches, also known as contact inputs, touch contacts or touch events, may be detected by the touch-
sensitive display 1318. The processor 1302 may determine attributes of the touch, including a location of a touch. Touch location data may include an area of contact or a single point of contact, such as a point at or near a center of the area of contact, known as the centroid. A signal is provided to the controller 1316 in response to detection of a touch. A touch may be detected from any suitable object, such as a finger, thumb, appendage, or other item, for example a stylus, pen, or other pointer, depending on the nature of the touch-sensitive display 1318. The location of the touch moves as the detected object moves during a touch. The controller 1316 and/or the processor 1302 may detect a touch by any suitable contact member on the touch-sensitive display 1318. Similarly, multiple simultaneous touches are detected. - One or more gestures are also detected by the touch-
sensitive display 1318. A gesture is a particular type of touch on a touch-sensitive display 1318 that begins at an origin point and continues to an end point. A gesture may be identified by attributes of the gesture, including the origin point, the end point, the distance travelled, the duration, the velocity, and the direction, for example. A gesture may be long or short in distance and/or duration. Two points of the gesture may be utilized to determine a direction of the gesture. - An example of a gesture is a swipe (also known as a flick). A swipe has a single direction. The touch-
sensitive overlay 1314 may evaluate swipes with respect to the origin point at which contact is initially made with the touch-sensitive overlay 1314 and the end point at which contact with the touch-sensitive overlay 1314 ends, rather than using each location or point of contact over the duration of the gesture to resolve a direction. - Examples of swipes include a horizontal swipe, a vertical swipe, and a diagonal swipe. A horizontal swipe typically comprises an origin point towards the left or right side of the touch-
sensitive overlay 1314 to initialize the gesture, a horizontal movement of the detected object from the origin point to an end point towards the right or left side of the touch-sensitive overlay 1314 while maintaining continuous contact with the touch-sensitive overlay 1314, and a breaking of contact with the touch-sensitive overlay 1314. Similarly, a vertical swipe typically comprises an origin point towards the top or bottom of the touch-sensitive overlay 1314 to initialize the gesture, a vertical movement of the detected object from the origin point to an end point towards the bottom or top of the touch-sensitive overlay 1314 while maintaining continuous contact with the touch-sensitive overlay 1314, and a breaking of contact with the touch-sensitive overlay 1314. - Swipes can be of various lengths, can be initiated in various places on the touch-
sensitive overlay 1314, and need not span the full dimension of the touch-sensitive overlay 1314. In addition, breaking contact of a swipe can be gradual in that contact with the touch-sensitive overlay 1314 is gradually reduced while the swipe is still underway. - Meta-navigation gestures may also be detected by the touch-
sensitive overlay 1314. A meta-navigation gesture is a gesture that has an origin point that is outside the display area of the touch-sensitive overlay 1314 and that moves to a position on the display area of the touch-sensitive display. Other attributes of the gesture may be detected and be utilized to detect the meta-navigation gesture. Meta-navigation gestures may also include multi-touch gestures in which gestures are simultaneous or overlap in time and at least one of the touches has an origin point that is outside the display area and moves to a position on the display area of the touch-sensitive overlay 1314. Thus, two fingers may be utilized for meta-navigation gestures. Further, multi-touch meta-navigation gestures may be distinguished from single touch meta-navigation gestures and may provide additional or further functionality, or be used to distinguish between types of information that the user may require to be displayed on the device, which would also depend on the display area available for display. - In some example embodiments, an
optional force sensor 1322 or force sensors is disposed in any suitable location, for example, between the touch-sensitive display 1318 and a back of the portable electronic device 100 to detect a force imparted by a touch on the touch-sensitive display 1318. The force sensor 1322 may be a force-sensitive resistor, strain gauge, piezoelectric or piezoresistive device, pressure sensor, or other suitable device. Force as utilized throughout the specification refers to force measurements, estimates, and/or calculations, such as pressure, deformation, stress, strain, force density, force-area relationships, thrust, torque, and other effects that include force or related quantities. - Force information related to a detected touch may be utilized to select information, such as information associated with a location of a touch. For example, a touch that does not meet a force threshold may highlight a selection option, whereas a touch that meets a force threshold may select or input that selection option. Different magnitudes of force may be associated with different functions or input. For example, a lesser force may result in displaying only time information in the user interface information, and a higher force may result in the display of e-mail information in the user interface information.
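The force-to-information mapping described above can be sketched with two thresholds. The threshold values and units below are illustrative assumptions only:

```python
def info_for_force(force: float) -> str:
    """Map touch force to user interface information (thresholds assumed)."""
    light, firm = 0.5, 2.0          # newtons; illustrative values only
    if force >= firm:
        return "email"              # higher force: e-mail information
    if force >= light:
        return "time"               # lesser force: time information only
    return "none"                   # below threshold: nothing selected
```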
-
FIG. 14 is a front view of an example of a portable electronic device 100. The portable electronic device 100 includes a housing 1402 that encloses components such as those shown in FIG. 13. The housing 1402 may include a back, sidewalls, and a front 1404 that frames the touch-sensitive display 1318. - In the example of
FIG. 14 , the touch-sensitive display 1318 is generally centered in the housing 1402 such that a display area 1406 of the touch-sensitive overlay 1314 is generally centered with respect to the front 1404 of the housing 1402. The non-display area 1408 of the touch-sensitive overlay 1314 extends around the display area 1406. In the presently described embodiment, the width of the non-display area is 4 mm. - For the purpose of the present example, the touch-
sensitive overlay 1314 extends to cover the display area 1406 and the non-display area 1408. Touches on the display area 1406 may be detected and, for example, may be associated with displayed selectable features. Touches on the non-display area 1408 may be detected, for example, to detect a meta-navigation gesture. Alternatively, meta-navigation gestures may be determined by both the non-display area 1408 and the display area 1406. The density of touch sensors may differ from the display area 1406 to the non-display area 1408. For example, the density of nodes in a mutual capacitive touch-sensitive display, or density of locations at which electrodes of one layer cross over electrodes of another layer, may differ between the display area 1406 and the non-display area 1408. - Gestures received on the touch-
sensitive display 1318 may be analyzed based on the attributes to discriminate between meta-navigation gestures and other touches, or non-meta navigation gestures. Meta-navigation gestures may be identified when the gesture crosses over a boundary near a periphery of the display 112, such as a boundary 1410 between the display area 1406 and the non-display area 1408. In the example of FIG. 14 , the origin point of a meta-navigation gesture may be determined utilizing the area of the touch-sensitive overlay 1314 that covers the non-display area 1408. - A
buffer region 1412 or band that extends around the boundary 1410 between the display area 1406 and the non-display area 1408 may be utilized such that a meta-navigation gesture is identified when a touch has an origin point outside the boundary 1410 and the buffer region 1412 and crosses through the buffer region 1412 and over the boundary 1410 to a point inside the boundary 1410. Although illustrated in FIG. 14 , the buffer region 1412 may not be visible. Instead, the buffer region 1412 may be a region around the boundary 1410 that extends a width that is equivalent to a predetermined number of pixels, for example. Alternatively, the boundary 1410 may extend a predetermined number of touch sensors or may extend a predetermined distance from the display area 1406. The boundary 1410 may be a touch-sensitive region or may be a region in which touches are not detected. - Gestures that have an origin point in the
buffer region 1412, for example, may be identified as non-meta navigation gestures. Optionally, data from such gestures may be utilized by an application as a non-meta navigation gesture. Alternatively, data from such gestures may be discarded such that touches that have an origin point on the buffer region 1412 are not utilized as input at the portable electronic device 100. -
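The boundary-and-buffer classification described above can be sketched as follows. This is a simplified model under stated assumptions: the display area is a rectangle, the buffer is a uniform band just outside it, and a gesture is given as an origin point plus its sampled path:

```python
def classify_gesture(origin, path, display_rect, buffer_px):
    """Classify a gesture as 'meta_navigation', 'non_meta', or 'touch'.

    display_rect is (left, top, right, bottom) of the display area 1406;
    the buffer region is a band of buffer_px just outside that boundary.
    """
    left, top, right, bottom = display_rect

    def in_display(p):
        x, y = p
        return left <= x <= right and top <= y <= bottom

    def in_buffer(p):
        x, y = p
        if in_display(p):
            return False
        return (left - buffer_px <= x <= right + buffer_px and
                top - buffer_px <= y <= bottom + buffer_px)

    if in_display(origin):
        return "touch"                   # ordinary on-display gesture
    if in_buffer(origin):
        return "non_meta"                # origin in the buffer region
    if any(in_display(p) for p in path):
        return "meta_navigation"         # outside the buffer, crossing in
    return "non_meta"                    # never reached the display area
```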
FIG. 15 illustrates examples of touches on the touch-sensitive display 1318 of the portable electronic device of FIG. 14 . The buffer region 1412 is illustrated in FIG. 15 by hash markings for the purpose of explanation. As indicated, the buffer region 1412 may not be visible to the user. For the purpose of explanation, touches are illustrated by circles at their points of origin. Arrows extending from the circles illustrate the paths of the touches that are gestures. - The
touch 1502 begins at an origin point outside the boundary 1410 and outside the buffer region 1412. The path of the touch 1502 crosses the buffer region 1412 and the boundary 1410 and is therefore identified as a meta-navigation gesture. Similarly, other touches begin at origin points outside the boundary 1410 and the buffer region 1412, and their paths cross the buffer region 1412 and the boundary 1410; each of these touches is therefore identified as a meta-navigation gesture entering the display area 1406. A touch contact such as the touch contact 1512 that does not enter the display area 1406 may not activate the display of the user interface, as no definable area would be visible. However, the touch contact 1512 may be combined with another touch contact, such as the touch contact 1502, to define a displayable area for user interface information. - The
touch contact 1516 may also be defined as a region extending across the surface of the display, such as provided by a capacitive element provided within the case. The display area may be defined from the position of the touch contact 1506 parallel to an edge of the device housing 1402 and may be defined in a landscape or portrait mode depending on the aspect ratio of the device. Although the device is described as having a touch-sensitive non-display area, the device may also lack a touch-sensitive non-display area in embodiments of the present disclosure. - Although capacitive display technologies are described in the above examples, other types of display sensing technologies may be utilized to identify a gesture or movement of the portable electronic device within the case.
- In some embodiments, any suitable computer readable memory can be used for storing instructions for performing the processes described herein. For example, in some embodiments, computer readable media can be transitory or non-transitory. For example, non-transitory computer readable media can include non-volatile computer storage memory or media such as magnetic media (such as hard disks), optical media (such as compact discs, digital video discs, Blu-ray discs, etc.), semiconductor media (such as flash memory, read only memory (ROM), electrically programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), etc.), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media. As another example, transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, circuits, and any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.
- Although the description discloses example methods and apparatus including, among other components, software executed on hardware, it should be noted that such methods and apparatus are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware and software components could be embodied exclusively in hardware, exclusively in software, exclusively in firmware, or in any combination of hardware, software, and/or firmware. Accordingly, while the foregoing describes example methods and apparatus, persons having ordinary skill in the art will readily appreciate that the examples provided are not the only way to implement such methods and apparatus.
Claims (27)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/404,308 US9778706B2 (en) | 2012-02-24 | 2012-02-24 | Peekable user interface on a portable electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130222323A1 true US20130222323A1 (en) | 2013-08-29 |
US9778706B2 US9778706B2 (en) | 2017-10-03 |
Family
ID=49002319
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/404,308 Active 2032-05-16 US9778706B2 (en) | 2012-02-24 | 2012-02-24 | Peekable user interface on a portable electronic device |
Country Status (1)
Country | Link |
---|---|
US (1) | US9778706B2 (en) |
US9426905B2 (en) | 2012-03-02 | 2016-08-23 | Microsoft Technology Licensing, Llc | Connection device for computing devices |
EP3065031A1 (en) * | 2015-03-02 | 2016-09-07 | BlackBerry Limited | System and method of rendering data based on an angle of a carrying case flap |
US9474022B2 (en) | 2012-11-30 | 2016-10-18 | Nvidia Corporation | Saving power in a mobile terminal |
US20160334851A1 (en) * | 2013-08-28 | 2016-11-17 | Apple Inc. | Sensor for detecting presence of material |
US9537811B2 (en) | 2014-10-02 | 2017-01-03 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US20170031591A1 (en) * | 2015-07-31 | 2017-02-02 | Samsung Electronics Co., Ltd. | Screen controlling method and electronic device for supporting the same |
US9575555B2 (en) | 2012-06-08 | 2017-02-21 | Apple Inc. | Peek mode and graphical user interface (GUI) experience |
US20170115693A1 (en) * | 2013-04-25 | 2017-04-27 | Yonggui Li | Frameless Tablet |
KR20170063886A (en) * | 2014-09-30 | 2017-06-08 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Displaying content on a display in power save mode |
US9705831B2 (en) | 2013-05-30 | 2017-07-11 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9721394B2 (en) | 2012-08-22 | 2017-08-01 | Snaps Media, Inc. | Augmented reality virtual content platform apparatuses, methods and systems |
US9729685B2 (en) | 2011-09-28 | 2017-08-08 | Apple Inc. | Cover for a tablet device |
US9742713B2 (en) | 2013-05-30 | 2017-08-22 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9760150B2 (en) | 2012-11-27 | 2017-09-12 | Nvidia Corporation | Low-power states for a computer system with integrated baseband |
EP3110022A4 (en) * | 2014-02-19 | 2017-10-11 | LG Electronics Inc. | Mobile terminal and method for controlling same |
US9793073B2 (en) | 2012-03-02 | 2017-10-17 | Microsoft Technology Licensing, Llc | Backlighting a fabric enclosure of a flexible cover |
US9823728B2 (en) | 2013-09-04 | 2017-11-21 | Nvidia Corporation | Method and system for reduced rate touch scanning on an electronic device |
US9843720B1 (en) | 2014-11-12 | 2017-12-12 | Snap Inc. | User interface for accessing media at a geographic location |
CN107491257A (en) * | 2016-06-12 | 2017-12-19 | 苹果公司 | For accessing the apparatus and method of common equipment function |
US9854219B2 (en) | 2014-12-19 | 2017-12-26 | Snap Inc. | Gallery of videos set to an audio time line |
US9866999B1 (en) | 2014-01-12 | 2018-01-09 | Investment Asset Holdings Llc | Location-based messaging |
US9870066B2 (en) | 2012-03-02 | 2018-01-16 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
US9881094B2 (en) | 2015-05-05 | 2018-01-30 | Snap Inc. | Systems and methods for automated local story generation and curation |
US9881592B2 (en) | 2013-10-08 | 2018-01-30 | Nvidia Corporation | Hardware overlay assignment |
US9882907B1 (en) | 2012-11-08 | 2018-01-30 | Snap Inc. | Apparatus and method for single action control of social network profile access |
US9936030B2 (en) | 2014-01-03 | 2018-04-03 | Investel Capital Corporation | User content sharing system and method with location-based external content integration |
US9961433B2 (en) | 2015-09-30 | 2018-05-01 | Apple Inc. | Case with inductive charging system to charge a portable device |
US10031556B2 (en) | 2012-06-08 | 2018-07-24 | Microsoft Technology Licensing, Llc | User experience adaptation |
US10042491B2 (en) | 2013-11-19 | 2018-08-07 | Quickstep Technologies Llc | Cover accessory device for a portable electronic and/or computer apparatus, and apparatus provided with such an accessory device |
US20180225030A1 (en) * | 2017-01-17 | 2018-08-09 | Nanoport Technology Inc. | Electronic device having force-based modifiable graphical elements and method of operating same |
US10055717B1 (en) | 2014-08-22 | 2018-08-21 | Snap Inc. | Message processor with application prompts |
US10082926B1 (en) | 2014-02-21 | 2018-09-25 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10102680B2 (en) | 2015-10-30 | 2018-10-16 | Snap Inc. | Image based tracking in augmented reality systems |
US10123166B2 (en) | 2015-01-26 | 2018-11-06 | Snap Inc. | Content request by location |
US10120563B2 (en) * | 2014-12-24 | 2018-11-06 | Intel Corporation | User interface for liquid container |
US10135949B1 (en) | 2015-05-05 | 2018-11-20 | Snap Inc. | Systems and methods for story and sub-story navigation |
US10133705B1 (en) | 2015-01-19 | 2018-11-20 | Snap Inc. | Multichannel system |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US10165402B1 (en) | 2016-06-28 | 2018-12-25 | Snap Inc. | System to track engagement of media items |
US10203855B2 (en) | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
US20190050045A1 (en) * | 2017-08-14 | 2019-02-14 | Samsung Electronics Co., Ltd. | Method for displaying content and electronic device thereof |
US10219111B1 (en) | 2018-04-18 | 2019-02-26 | Snap Inc. | Visitation tracking system |
US10223397B1 (en) | 2015-03-13 | 2019-03-05 | Snap Inc. | Social graph based co-location of network users |
US10284508B1 (en) | 2014-10-02 | 2019-05-07 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US10311916B2 (en) | 2014-12-19 | 2019-06-04 | Snap Inc. | Gallery of videos set to an audio time line |
US10319149B1 (en) | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US10325567B2 (en) * | 2016-11-01 | 2019-06-18 | Hyundai Motor Company | Vehicle and method for controlling the same |
CN109901770A (en) * | 2014-12-30 | 2019-06-18 | 华为终端有限公司 | A kind of display methods and mobile terminal of graphic user interface |
US10327096B1 (en) | 2018-03-06 | 2019-06-18 | Snap Inc. | Geo-fence selection system |
US10334307B2 (en) | 2011-07-12 | 2019-06-25 | Snap Inc. | Methods and systems of providing visual content editing functions |
US10348662B2 (en) | 2016-07-19 | 2019-07-09 | Snap Inc. | Generating customized electronic messaging graphics |
US10354425B2 (en) | 2015-12-18 | 2019-07-16 | Snap Inc. | Method and system for providing context relevant media augmentation |
US10387514B1 (en) | 2016-06-30 | 2019-08-20 | Snap Inc. | Automated content curation and communication |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US10423983B2 (en) | 2014-09-16 | 2019-09-24 | Snap Inc. | Determining targeting information based on a predictive targeting model |
US10430838B1 (en) | 2016-06-28 | 2019-10-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections with automated advertising |
US10439972B1 (en) | 2013-05-30 | 2019-10-08 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
EP3151088B1 (en) * | 2015-10-02 | 2019-11-06 | BlackBerry Limited | Method and apparatus for movable assembly position sensing and virtual keyboard display |
US10474321B2 (en) | 2015-11-30 | 2019-11-12 | Snap Inc. | Network resource location linking and visual content sharing |
US10499191B1 (en) | 2017-10-09 | 2019-12-03 | Snap Inc. | Context sensitive presentation of content |
US10523625B1 (en) | 2017-03-09 | 2019-12-31 | Snap Inc. | Restricted group content collection |
US10521579B2 (en) | 2017-09-09 | 2019-12-31 | Apple Inc. | Implementation of biometric authentication |
US10581782B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US10582277B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US10614828B1 (en) | 2017-02-20 | 2020-04-07 | Snap Inc. | Augmented reality speech balloon system |
US10616239B2 (en) | 2015-03-18 | 2020-04-07 | Snap Inc. | Geo-fence authorization provisioning |
US10623666B2 (en) | 2016-11-07 | 2020-04-14 | Snap Inc. | Selective identification and order of image modifiers |
US10638256B1 (en) | 2016-06-20 | 2020-04-28 | Pipbin, Inc. | System for distribution and display of mobile targeted augmented reality content |
US10657708B1 (en) | 2015-11-30 | 2020-05-19 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
CN111192547A (en) * | 2018-11-14 | 2020-05-22 | 乐金显示有限公司 | Foldable display and driving method thereof |
US10679389B2 (en) | 2016-02-26 | 2020-06-09 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US10678818B2 (en) | 2018-01-03 | 2020-06-09 | Snap Inc. | Tag distribution visualization system |
US10679393B2 (en) | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
US10698598B2 (en) | 2015-08-10 | 2020-06-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10705718B2 (en) | 2015-06-07 | 2020-07-07 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10712934B2 (en) | 2016-06-12 | 2020-07-14 | Apple Inc. | Devices and methods for accessing prevalent device functions |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US10749967B2 (en) | 2016-05-19 | 2020-08-18 | Apple Inc. | User interface for remote authorization |
US10748153B2 (en) | 2014-05-29 | 2020-08-18 | Apple Inc. | User interface for payments |
US10775999B2 (en) | 2012-05-09 | 2020-09-15 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10775994B2 (en) | 2012-05-09 | 2020-09-15 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
US10783576B1 (en) | 2019-03-24 | 2020-09-22 | Apple Inc. | User interfaces for managing an account |
US10805696B1 (en) | 2016-06-20 | 2020-10-13 | Pipbin, Inc. | System for recording and targeting tagged content of user interest |
US10803281B2 (en) | 2013-09-09 | 2020-10-13 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs |
US10817898B2 (en) | 2015-08-13 | 2020-10-27 | Placed, Llc | Determining exposures to content presented by physical objects |
US10824654B2 (en) | 2014-09-18 | 2020-11-03 | Snap Inc. | Geolocation-based pictographs |
US10834525B2 (en) | 2016-02-26 | 2020-11-10 | Snap Inc. | Generation, curation, and presentation of media collections |
US10841484B2 (en) | 2015-06-07 | 2020-11-17 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10839219B1 (en) | 2016-06-20 | 2020-11-17 | Pipbin, Inc. | System for curation, distribution and display of location-dependent augmented reality content |
US10862951B1 (en) | 2007-01-05 | 2020-12-08 | Snap Inc. | Real-time display of multiple images |
US10860177B2 (en) | 2015-03-08 | 2020-12-08 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10860096B2 (en) | 2018-09-28 | 2020-12-08 | Apple Inc. | Device control using gaze information |
US10872256B2 (en) | 2017-09-09 | 2020-12-22 | Apple Inc. | Implementation of biometric authentication |
US10884591B2 (en) | 2012-05-09 | 2021-01-05 | Apple Inc. | Device, method, and graphical user interface for selecting object within a group of objects |
US10885136B1 (en) | 2018-02-28 | 2021-01-05 | Snap Inc. | Audience filtering system |
US10884608B2 (en) | 2015-08-10 | 2021-01-05 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10908808B2 (en) | 2012-05-09 | 2021-02-02 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
US10915911B2 (en) | 2017-02-03 | 2021-02-09 | Snap Inc. | System to determine a price-schedule to distribute media content |
US10915243B2 (en) | 2012-12-29 | 2021-02-09 | Apple Inc. | Device, method, and graphical user interface for adjusting content selection |
US10933311B2 (en) | 2018-03-14 | 2021-03-02 | Snap Inc. | Generating collectible items based on location information |
US10942570B2 (en) | 2012-05-09 | 2021-03-09 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US10948717B1 (en) | 2015-03-23 | 2021-03-16 | Snap Inc. | Reducing boot time and power consumption in wearable display systems |
US10956550B2 (en) | 2007-09-24 | 2021-03-23 | Apple Inc. | Embedded authentication systems in an electronic device |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US10969945B2 (en) | 2012-05-09 | 2021-04-06 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US10993069B2 (en) | 2015-07-16 | 2021-04-27 | Snap Inc. | Dynamically adaptive media content delivery |
US10996788B2 (en) | 2012-05-09 | 2021-05-04 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US11010027B2 (en) | 2012-05-09 | 2021-05-18 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US11017173B1 (en) | 2017-12-22 | 2021-05-25 | Snap Inc. | Named entity recognition visual context and caption data |
US11023116B2 (en) | 2012-05-09 | 2021-06-01 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US11023514B2 (en) | 2016-02-26 | 2021-06-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11030787B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Mobile-based cartographic control of display content |
US11037150B2 (en) | 2016-06-12 | 2021-06-15 | Apple Inc. | User interfaces for transactions |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
US11044393B1 (en) | 2016-06-20 | 2021-06-22 | Pipbin, Inc. | System for curation and display of location-dependent augmented reality content in an augmented estate system |
US11054990B2 (en) | 2015-03-19 | 2021-07-06 | Apple Inc. | Touch input cursor manipulation |
US11074572B2 (en) | 2016-09-06 | 2021-07-27 | Apple Inc. | User interfaces for stored-value accounts |
US11100349B2 (en) | 2018-09-28 | 2021-08-24 | Apple Inc. | Audio assisted enrollment |
US11112957B2 (en) | 2015-03-08 | 2021-09-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11163941B1 (en) | 2018-03-30 | 2021-11-02 | Snap Inc. | Annotating a collection of media content items |
US11170085B2 (en) | 2018-06-03 | 2021-11-09 | Apple Inc. | Implementation of biometric authentication |
US11172101B1 (en) | 2018-09-20 | 2021-11-09 | Apple Inc. | Multifunction accessory case |
US11170393B1 (en) | 2017-04-11 | 2021-11-09 | Snap Inc. | System to calculate an engagement score of location based media content |
US11182017B2 (en) | 2015-08-10 | 2021-11-23 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US11182383B1 (en) | 2012-02-24 | 2021-11-23 | Placed, Llc | System and method for data collection to validate location data |
US11200309B2 (en) | 2011-09-29 | 2021-12-14 | Apple Inc. | Authentication with secondary approver |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US11201981B1 (en) | 2016-06-20 | 2021-12-14 | Pipbin, Inc. | System for notification of user accessibility of curated location-dependent content in an augmented estate |
US11206615B2 (en) | 2019-05-30 | 2021-12-21 | Snap Inc. | Wearable device location systems |
US11216869B2 (en) | 2014-09-23 | 2022-01-04 | Snap Inc. | User interface to augment an image using geolocation |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11228551B1 (en) | 2020-02-12 | 2022-01-18 | Snap Inc. | Multiple gateway message exchange |
US11232040B1 (en) | 2017-04-28 | 2022-01-25 | Snap Inc. | Precaching unlockable data elements |
US11231831B2 (en) | 2015-06-07 | 2022-01-25 | Apple Inc. | Devices and methods for content preview based on touch input intensity |
US11240424B2 (en) | 2015-06-07 | 2022-02-01 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11249614B2 (en) | 2019-03-28 | 2022-02-15 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11250075B1 (en) | 2017-02-17 | 2022-02-15 | Snap Inc. | Searching social media content |
US11265273B1 (en) | 2017-12-01 | 2022-03-01 | Snap, Inc. | Dynamic media overlay with smart widget |
USRE48963E1 (en) | 2012-03-02 | 2022-03-08 | Microsoft Technology Licensing, Llc | Connection device for computing devices |
US11290851B2 (en) | 2020-06-15 | 2022-03-29 | Snap Inc. | Location sharing using offline and online objects |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11314776B2 (en) | 2020-06-15 | 2022-04-26 | Snap Inc. | Location sharing using friend list versions |
US11314407B2 (en) | 2012-05-09 | 2022-04-26 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US11321731B2 (en) | 2015-06-05 | 2022-05-03 | Apple Inc. | User interface for loyalty accounts and private label accounts |
US11343323B2 (en) | 2019-12-31 | 2022-05-24 | Snap Inc. | Augmented reality objects registry |
US11361493B2 (en) | 2019-04-01 | 2022-06-14 | Snap Inc. | Semantic texture mapping system |
US11388226B1 (en) | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
US11416042B2 (en) * | 2011-11-22 | 2022-08-16 | Samsung Electronics Co., Ltd. | Flexible display apparatus and method of providing user interface by using the same |
US11429618B2 (en) | 2019-12-30 | 2022-08-30 | Snap Inc. | Surfacing augmented reality objects |
US11430091B2 (en) | 2020-03-27 | 2022-08-30 | Snap Inc. | Location mapping for large scale augmented-reality |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11475254B1 (en) | 2017-09-08 | 2022-10-18 | Snap Inc. | Multimodal entity identification |
US11481769B2 (en) | 2016-06-11 | 2022-10-25 | Apple Inc. | User interface for transactions |
US11483267B2 (en) | 2020-06-15 | 2022-10-25 | Snap Inc. | Location sharing using different rate-limited links |
US11500525B2 (en) | 2019-02-25 | 2022-11-15 | Snap Inc. | Custom media overlay system |
US11503432B2 (en) | 2020-06-15 | 2022-11-15 | Snap Inc. | Scalable real-time location sharing framework |
US11507614B1 (en) | 2018-02-13 | 2022-11-22 | Snap Inc. | Icon based tagging |
US11516167B2 (en) | 2020-03-05 | 2022-11-29 | Snap Inc. | Storing data based on device location |
US11558709B2 (en) | 2018-11-30 | 2023-01-17 | Snap Inc. | Position service to determine relative position to map features |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US11574041B2 (en) | 2016-10-25 | 2023-02-07 | Apple Inc. | User interface for managing access to credentials for use in an operation |
US11601888B2 (en) | 2021-03-29 | 2023-03-07 | Snap Inc. | Determining location using multi-source geolocation data |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11606755B2 (en) | 2019-05-30 | 2023-03-14 | Snap Inc. | Wearable device location systems architecture |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11625443B2 (en) | 2014-06-05 | 2023-04-11 | Snap Inc. | Web document enhancement |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11645324B2 (en) | 2021-03-31 | 2023-05-09 | Snap Inc. | Location-based timeline media content system |
US11675831B2 (en) | 2017-05-31 | 2023-06-13 | Snap Inc. | Geolocation based playlists |
US11676378B2 (en) | 2020-06-29 | 2023-06-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US11676373B2 (en) | 2008-01-03 | 2023-06-13 | Apple Inc. | Personal computing device control using face detection and recognition |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11729343B2 (en) | 2019-12-30 | 2023-08-15 | Snap Inc. | Including video feed in message thread |
US11734712B2 (en) | 2012-02-24 | 2023-08-22 | Foursquare Labs, Inc. | Attributing in-store visits to media consumption based on data collected from user devices |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US11776256B2 (en) | 2020-03-27 | 2023-10-03 | Snap Inc. | Shared augmented reality system |
US11785161B1 (en) | 2016-06-20 | 2023-10-10 | Pipbin, Inc. | System for user accessibility of tagged curated augmented reality content |
US11783305B2 (en) | 2015-06-05 | 2023-10-10 | Apple Inc. | User interface for loyalty accounts and private label accounts for a wearable device |
US11799811B2 (en) | 2018-10-31 | 2023-10-24 | Snap Inc. | Messaging and gaming applications communication platform |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11816853B2 (en) | 2016-08-30 | 2023-11-14 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US11816194B2 (en) | 2020-06-21 | 2023-11-14 | Apple Inc. | User interfaces for managing secure operations |
US11821742B2 (en) | 2019-09-26 | 2023-11-21 | Snap Inc. | Travel based notifications |
US11829834B2 (en) | 2021-10-29 | 2023-11-28 | Snap Inc. | Extended QR code |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11860888B2 (en) | 2018-05-22 | 2024-01-02 | Snap Inc. | Event detection system |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US11876941B1 (en) | 2016-06-20 | 2024-01-16 | Pipbin, Inc. | Clickable augmented reality content manager, system, and network |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11900418B2 (en) | 2016-04-04 | 2024-02-13 | Snap Inc. | Mutable geo-fencing system |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US11943192B2 (en) | 2020-08-31 | 2024-03-26 | Snap Inc. | Co-location connection service |
US11954314B2 (en) | 2022-09-09 | 2024-04-09 | Snap Inc. | Custom media overlay system |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070252821A1 (en) * | 2004-06-17 | 2007-11-01 | Koninklijke Philips Electronics, N.V. | Use of a Two Finger Input on Touch Screens |
US20080261664A1 (en) * | 2007-04-20 | 2008-10-23 | Fun Friends, Inc. | Cover for electronic device |
US20090058822A1 (en) * | 2007-09-04 | 2009-03-05 | Apple Inc. | Video Chapter Access and License Renewal |
US20100185989A1 (en) * | 2008-05-06 | 2010-07-22 | Palm, Inc. | User Interface For Initiating Activities In An Electronic Device |
US20100285881A1 (en) * | 2009-05-07 | 2010-11-11 | Microsoft Corporation | Touch gesturing on multi-player game space |
US20110205163A1 (en) * | 2010-02-19 | 2011-08-25 | Microsoft Corporation | Off-Screen Gestures to Create On-Screen Input |
US20120068798A1 (en) * | 2010-09-17 | 2012-03-22 | Apple Inc. | Accessory device with magnetic attachment |
US20120072167A1 (en) * | 2010-09-17 | 2012-03-22 | Apple Inc. | Sensor fusion |
US20120113572A1 (en) * | 2010-06-07 | 2012-05-10 | 360 Mobility Solutions, Llc | Method and system for electronic device cases |
US20120304114A1 (en) * | 2011-05-27 | 2012-11-29 | Tsz Yan Wong | Managing an immersive interface in a multi-application immersive environment |
US20130061170A1 (en) * | 2011-09-01 | 2013-03-07 | Sony Corporation | User interface element |
US20130076614A1 (en) * | 2011-09-28 | 2013-03-28 | Apple Inc. | Accessory device |
US20130106710A1 (en) * | 2011-10-31 | 2013-05-02 | Nokia Corporation | Methods, apparatuses, and computer program products for adjusting touchscreen sensitivity |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5712995A (en) | 1995-09-20 | 1998-01-27 | Galileo Frames, Inc. | Non-overlapping tiling apparatus and method for multiple window displays |
US7657849B2 (en) | 2005-12-23 | 2010-02-02 | Apple Inc. | Unlocking a device by performing gestures on an unlock image |
US8149000B2 (en) | 2007-08-31 | 2012-04-03 | Standard Microsystems Corporation | Detecting closure of an electronic device using capacitive sensors |
TW200939081A (en) | 2008-03-04 | 2009-09-16 | Mobinnova Corp | Method and apparatus for displaying keyboard icon of portable electronic device |
MX2011003069A (en) | 2008-09-24 | 2011-04-19 | Koninkl Philips Electronics Nv | A user interface for a multi-point touch sensitive device. |
US20100107067A1 (en) | 2008-10-27 | 2010-04-29 | Nokia Corporation | Input on touch based user interfaces |
KR101613086B1 (en) | 2009-01-05 | 2016-04-29 | 삼성전자주식회사 | Apparatus and method for display of electronic device |
KR101521932B1 (en) | 2009-01-19 | 2015-05-20 | 엘지전자 주식회사 | Terminal and method for controlling the same |
US8773470B2 (en) | 2010-05-07 | 2014-07-08 | Apple Inc. | Systems and methods for displaying visual information on a device |
US8390412B2 (en) | 2010-09-17 | 2013-03-05 | Apple Inc. | Protective cover |
2012
- 2012-02-24 US US13/404,308 patent/US9778706B2/en active Active
Cited By (518)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11588770B2 (en) | 2007-01-05 | 2023-02-21 | Snap Inc. | Real-time display of multiple images |
US10862951B1 (en) | 2007-01-05 | 2020-12-08 | Snap Inc. | Real-time display of multiple images |
US10956550B2 (en) | 2007-09-24 | 2021-03-23 | Apple Inc. | Embedded authentication systems in an electronic device |
US11468155B2 (en) | 2007-09-24 | 2022-10-11 | Apple Inc. | Embedded authentication systems in an electronic device |
US11676373B2 (en) | 2008-01-03 | 2023-06-13 | Apple Inc. | Personal computing device control using face detection and recognition |
US11750875B2 (en) | 2011-07-12 | 2023-09-05 | Snap Inc. | Providing visual content editing functions |
US10999623B2 (en) | 2011-07-12 | 2021-05-04 | Snap Inc. | Providing visual content editing functions |
US11451856B2 (en) | 2011-07-12 | 2022-09-20 | Snap Inc. | Providing visual content editing functions |
US10334307B2 (en) | 2011-07-12 | 2019-06-25 | Snap Inc. | Methods and systems of providing visual content editing functions |
US9729685B2 (en) | 2011-09-28 | 2017-08-08 | Apple Inc. | Cover for a tablet device |
US11755712B2 (en) | 2011-09-29 | 2023-09-12 | Apple Inc. | Authentication with secondary approver |
US11200309B2 (en) | 2011-09-29 | 2021-12-14 | Apple Inc. | Authentication with secondary approver |
US20130106710A1 (en) * | 2011-10-31 | 2013-05-02 | Nokia Corporation | Methods, apparatuses, and computer program products for adjusting touchscreen sensitivity |
US11416042B2 (en) * | 2011-11-22 | 2022-08-16 | Samsung Electronics Co., Ltd. | Flexible display apparatus and method of providing user interface by using the same |
US9354748B2 (en) | 2012-02-13 | 2016-05-31 | Microsoft Technology Licensing, Llc | Optical stylus interaction |
US11182383B1 (en) | 2012-02-24 | 2021-11-23 | Placed, Llc | System and method for data collection to validate location data |
US11734712B2 (en) | 2012-02-24 | 2023-08-22 | Foursquare Labs, Inc. | Attributing in-store visits to media consumption based on data collected from user devices |
US9710093B2 (en) | 2012-03-02 | 2017-07-18 | Microsoft Technology Licensing, Llc | Pressure sensitive key normalization |
US20140043275A1 (en) * | 2012-03-02 | 2014-02-13 | Microsoft Corporation | Sensing User Input At Display Area Edge |
US9268373B2 (en) | 2012-03-02 | 2016-02-23 | Microsoft Technology Licensing, Llc | Flexible hinge spine |
US9275809B2 (en) | 2012-03-02 | 2016-03-01 | Microsoft Technology Licensing, Llc | Device camera angle |
US10963087B2 (en) | 2012-03-02 | 2021-03-30 | Microsoft Technology Licensing, Llc | Pressure sensitive keys |
US9298236B2 (en) | 2012-03-02 | 2016-03-29 | Microsoft Technology Licensing, Llc | Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter |
US9304948B2 (en) | 2012-03-02 | 2016-04-05 | Microsoft Technology Licensing, Llc | Sensing user input at display area edge |
US9904327B2 (en) | 2012-03-02 | 2018-02-27 | Microsoft Technology Licensing, Llc | Flexible hinge and removable attachment |
US9304949B2 (en) * | 2012-03-02 | 2016-04-05 | Microsoft Technology Licensing, Llc | Sensing user input at display area edge |
US9852855B2 (en) | 2012-03-02 | 2017-12-26 | Microsoft Technology Licensing, Llc | Pressure sensitive key normalization |
US9619071B2 (en) | 2012-03-02 | 2017-04-11 | Microsoft Technology Licensing, Llc | Computing device and an apparatus having sensors configured for measuring spatial information indicative of a position of the computing devices |
US9946307B2 (en) | 2012-03-02 | 2018-04-17 | Microsoft Technology Licensing, Llc | Classifying the intent of user input |
US9360893B2 (en) | 2012-03-02 | 2016-06-07 | Microsoft Technology Licensing, Llc | Input device writing surface |
US9618977B2 (en) | 2012-03-02 | 2017-04-11 | Microsoft Technology Licensing, Llc | Input device securing techniques |
US9793073B2 (en) | 2012-03-02 | 2017-10-17 | Microsoft Technology Licensing, Llc | Backlighting a fabric enclosure of a flexible cover |
US9766663B2 (en) | 2012-03-02 | 2017-09-19 | Microsoft Technology Licensing, Llc | Hinge for component attachment |
US9870066B2 (en) | 2012-03-02 | 2018-01-16 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
US10013030B2 (en) | 2012-03-02 | 2018-07-03 | Microsoft Technology Licensing, Llc | Multiple position input device cover |
US9411751B2 (en) | 2012-03-02 | 2016-08-09 | Microsoft Technology Licensing, Llc | Key formation |
US9426905B2 (en) | 2012-03-02 | 2016-08-23 | Microsoft Technology Licensing, Llc | Connection device for computing devices |
US9678542B2 (en) | 2012-03-02 | 2017-06-13 | Microsoft Technology Licensing, Llc | Multiple position input device cover |
USRE48963E1 (en) | 2012-03-02 | 2022-03-08 | Microsoft Technology Licensing, Llc | Connection device for computing devices |
US9460029B2 (en) | 2012-03-02 | 2016-10-04 | Microsoft Technology Licensing, Llc | Pressure sensitive keys |
US9465412B2 (en) | 2012-03-02 | 2016-10-11 | Microsoft Technology Licensing, Llc | Input device layers and nesting |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US10884591B2 (en) | 2012-05-09 | 2021-01-05 | Apple Inc. | Device, method, and graphical user interface for selecting object within a group of objects |
US10942570B2 (en) | 2012-05-09 | 2021-03-09 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US11947724B2 (en) | 2012-05-09 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US11221675B2 (en) | 2012-05-09 | 2022-01-11 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US10908808B2 (en) | 2012-05-09 | 2021-02-02 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
US10996788B2 (en) | 2012-05-09 | 2021-05-04 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US11314407B2 (en) | 2012-05-09 | 2022-04-26 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US11354033B2 (en) | 2012-05-09 | 2022-06-07 | Apple Inc. | Device, method, and graphical user interface for managing icons in a user interface region |
US11010027B2 (en) | 2012-05-09 | 2021-05-18 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US10969945B2 (en) | 2012-05-09 | 2021-04-06 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
US11023116B2 (en) | 2012-05-09 | 2021-06-01 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US10775994B2 (en) | 2012-05-09 | 2020-09-15 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
US11068153B2 (en) | 2012-05-09 | 2021-07-20 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10775999B2 (en) | 2012-05-09 | 2020-09-15 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US9959241B2 (en) | 2012-05-14 | 2018-05-01 | Microsoft Technology Licensing, Llc | System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state |
US9348605B2 (en) | 2012-05-14 | 2016-05-24 | Microsoft Technology Licensing, Llc | System and method for accessory device architecture that passes human interface device (HID) data via intermediate processor |
US9575555B2 (en) | 2012-06-08 | 2017-02-21 | Apple Inc. | Peek mode and graphical user interface (GUI) experience |
US10031556B2 (en) | 2012-06-08 | 2018-07-24 | Microsoft Technology Licensing, Llc | User experience adaptation |
US20130328917A1 (en) * | 2012-06-08 | 2013-12-12 | Apple Inc. | Smart cover peek |
US9256089B2 (en) | 2012-06-15 | 2016-02-09 | Microsoft Technology Licensing, Llc | Object-detecting backlight unit |
US10169924B2 (en) | 2012-08-22 | 2019-01-01 | Snaps Media Inc. | Augmented reality virtual content platform apparatuses, methods and systems |
US9721394B2 (en) | 2012-08-22 | 2017-08-01 | Snaps Media, Inc. | Augmented reality virtual content platform apparatuses, methods and systems |
US9792733B2 (en) | 2012-08-22 | 2017-10-17 | Snaps Media, Inc. | Augmented reality virtual content platform apparatuses, methods and systems |
US10887308B1 (en) | 2012-11-08 | 2021-01-05 | Snap Inc. | Interactive user-interface to adjust access privileges |
US11252158B2 (en) | 2012-11-08 | 2022-02-15 | Snap Inc. | Interactive user-interface to adjust access privileges |
US9882907B1 (en) | 2012-11-08 | 2018-01-30 | Snap Inc. | Apparatus and method for single action control of social network profile access |
US9760150B2 (en) | 2012-11-27 | 2017-09-12 | Nvidia Corporation | Low-power states for a computer system with integrated baseband |
US9474022B2 (en) | 2012-11-30 | 2016-10-18 | Nvidia Corporation | Saving power in a mobile terminal |
US10915243B2 (en) | 2012-12-29 | 2021-02-09 | Apple Inc. | Device, method, and graphical user interface for adjusting content selection |
US10671193B2 (en) * | 2013-01-16 | 2020-06-02 | Samsung Electronics Co., Ltd. | Mobile device and method for displaying information |
US20160154512A1 (en) * | 2013-01-16 | 2016-06-02 | Samsung Electronics Co., Ltd. | Mobile device and method for displaying information |
US20190306277A1 (en) * | 2013-02-25 | 2019-10-03 | Microsoft Technology Licensing, Llc | Interaction between devices displaying application status information |
US20140244715A1 (en) * | 2013-02-25 | 2014-08-28 | Microsoft Corporation | Interaction between devices displaying application status information |
US10122827B2 (en) * | 2013-02-25 | 2018-11-06 | Microsoft Technology Licensing, Llc | Interaction between devices displaying application status information |
US9304549B2 (en) | 2013-03-28 | 2016-04-05 | Microsoft Technology Licensing, Llc | Hinge mechanism for rotatable component attachment |
US20170115693A1 (en) * | 2013-04-25 | 2017-04-27 | Yonggui Li | Frameless Tablet |
US11509618B2 (en) | 2013-05-30 | 2022-11-22 | Snap Inc. | Maintaining a message thread with opt-in permanence for entries |
US10439972B1 (en) | 2013-05-30 | 2019-10-08 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US11115361B2 (en) | 2013-05-30 | 2021-09-07 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US11134046B2 (en) | 2013-05-30 | 2021-09-28 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9742713B2 (en) | 2013-05-30 | 2017-08-22 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US9705831B2 (en) | 2013-05-30 | 2017-07-11 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US10587552B1 (en) | 2013-05-30 | 2020-03-10 | Snap Inc. | Apparatus and method for maintaining a message thread with opt-in permanence for entries |
US20140368423A1 (en) * | 2013-06-17 | 2014-12-18 | Nvidia Corporation | Method and system for low power gesture recognition for waking up mobile devices |
JP2015018180A (en) * | 2013-07-12 | 2015-01-29 | ソニー株式会社 | Information processing apparatus and storage medium |
US9798433B2 (en) * | 2013-07-18 | 2017-10-24 | Quickstep Technologies Llc | Guard accessory device for an electronic and/or computer apparatus, and apparatus equipped with such an accessory device |
US20160179247A1 (en) * | 2013-07-18 | 2016-06-23 | Fogale Nanotech | Guard accessory device for an electronic and/or computer apparatus, and apparatus equipped with such an accessory device |
US20150026623A1 (en) * | 2013-07-19 | 2015-01-22 | Apple Inc. | Device input modes with corresponding user interfaces |
US9645721B2 (en) * | 2013-07-19 | 2017-05-09 | Apple Inc. | Device input modes with corresponding cover configurations |
US8914752B1 (en) * | 2013-08-22 | 2014-12-16 | Snapchat, Inc. | Apparatus and method for accelerated display of ephemeral messages |
US10191528B2 (en) * | 2013-08-28 | 2019-01-29 | Apple Inc. | Sensor for detecting presence of material |
US20160334851A1 (en) * | 2013-08-28 | 2016-11-17 | Apple Inc. | Sensor for detecting presence of material |
US20150067578A1 (en) * | 2013-09-04 | 2015-03-05 | Samsung Electronics Co., Ltd | Apparatus and method for executing function in electronic device |
US9823728B2 (en) | 2013-09-04 | 2017-11-21 | Nvidia Corporation | Method and system for reduced rate touch scanning on an electronic device |
US11287942B2 (en) | 2013-09-09 | 2022-03-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces |
US11494046B2 (en) | 2013-09-09 | 2022-11-08 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on unlock inputs |
US11768575B2 (en) | 2013-09-09 | 2023-09-26 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on unlock inputs |
US10803281B2 (en) | 2013-09-09 | 2020-10-13 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs |
US20150082255A1 (en) * | 2013-09-16 | 2015-03-19 | Motorola Mobility Llc | Methods and apparatus for displaying notification information |
US9881592B2 (en) | 2013-10-08 | 2018-01-30 | Nvidia Corporation | Hardware overlay assignment |
US10802652B2 (en) * | 2013-11-19 | 2020-10-13 | Quickstep Technologies Llc | Cover accessory device for a portable electronic and/or computer apparatus, and apparatus provided with such an accessory device |
US11231749B2 (en) | 2013-11-19 | 2022-01-25 | Quickstep Technologies Llc | Cover accessory device for a portable electronic and/or computer apparatus, and apparatus provided with such an accessory device |
US20180321771A1 (en) * | 2013-11-19 | 2018-11-08 | Quickstep Technologies Llc | Cover accessory device for a portable electronic and/or computer apparatus, and apparatus provided with such an accessory device |
US10042491B2 (en) | 2013-11-19 | 2018-08-07 | Quickstep Technologies Llc | Cover accessory device for a portable electronic and/or computer apparatus, and apparatus provided with such an accessory device |
US11493964B2 (en) | 2013-11-19 | 2022-11-08 | Quickstep Technologies Llc | Cover accessory device for a portable electronic and/or computer apparatus, and apparatus provided with such an accessory device |
US10069876B1 (en) | 2013-11-26 | 2018-09-04 | Snap Inc. | Method and system for integrating real time communication features in applications |
US11546388B2 (en) | 2013-11-26 | 2023-01-03 | Snap Inc. | Method and system for integrating real time communication features in applications |
US9794303B1 (en) | 2013-11-26 | 2017-10-17 | Snap Inc. | Method and system for integrating real time communication features in applications |
US11102253B2 (en) | 2013-11-26 | 2021-08-24 | Snap Inc. | Method and system for integrating real time communication features in applications |
US10681092B1 (en) | 2013-11-26 | 2020-06-09 | Snap Inc. | Method and system for integrating real time communication features in applications |
US9083770B1 (en) | 2013-11-26 | 2015-07-14 | Snapchat, Inc. | Method and system for integrating real time communication features in applications |
US9936030B2 (en) | 2014-01-03 | 2018-04-03 | Investel Capital Corporation | User content sharing system and method with location-based external content integration |
US10349209B1 (en) | 2014-01-12 | 2019-07-09 | Investment Asset Holdings Llc | Location-based messaging |
US10080102B1 (en) | 2014-01-12 | 2018-09-18 | Investment Asset Holdings Llc | Location-based messaging |
US9866999B1 (en) | 2014-01-12 | 2018-01-09 | Investment Asset Holdings Llc | Location-based messaging |
EP3110022A4 (en) * | 2014-02-19 | 2017-10-11 | LG Electronics Inc. | Mobile terminal and method for controlling same |
US11902235B2 (en) | 2014-02-21 | 2024-02-13 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US11463393B2 (en) | 2014-02-21 | 2022-10-04 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US11463394B2 (en) | 2014-02-21 | 2022-10-04 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10084735B1 (en) | 2014-02-21 | 2018-09-25 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10082926B1 (en) | 2014-02-21 | 2018-09-25 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10949049B1 (en) | 2014-02-21 | 2021-03-16 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US10958605B1 (en) | 2014-02-21 | 2021-03-23 | Snap Inc. | Apparatus and method for alternate channel communication initiated through a common message thread |
US9407712B1 (en) | 2014-03-07 | 2016-08-02 | Snapchat, Inc. | Content delivery network for ephemeral objects |
US9237202B1 (en) | 2014-03-07 | 2016-01-12 | Snapchat, Inc. | Content delivery network for ephemeral objects |
US10002592B2 (en) * | 2014-03-28 | 2018-06-19 | Samsung Electronics Co., Ltd. | Displaying method of electronic device using a cover and electronic device thereof |
US20150278529A1 (en) * | 2014-03-28 | 2015-10-01 | Samsung Electronics Co., Ltd. | Displaying method of electronic device and electronic device thereof |
US9489130B2 (en) | 2014-04-01 | 2016-11-08 | Lg Electronics Inc. | Mobile terminal selectively acting only part of a display and control method thereof |
EP2927795A1 (en) * | 2014-04-01 | 2015-10-07 | LG Electronics Inc. | Mobile terminal and control method thereof |
US9276886B1 (en) | 2014-05-09 | 2016-03-01 | Snapchat, Inc. | Apparatus and method for dynamically configuring application component tiles |
US10817156B1 (en) | 2014-05-09 | 2020-10-27 | Snap Inc. | Dynamic configuration of application component tiles |
US11310183B2 (en) | 2014-05-09 | 2022-04-19 | Snap Inc. | Dynamic configuration of application component tiles |
US11743219B2 (en) | 2014-05-09 | 2023-08-29 | Snap Inc. | Dynamic configuration of application component tiles |
US10990697B2 (en) | 2014-05-28 | 2021-04-27 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US9396354B1 (en) | 2014-05-28 | 2016-07-19 | Snapchat, Inc. | Apparatus and method for automated privacy protection in distributed images |
US9785796B1 (en) | 2014-05-28 | 2017-10-10 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US10572681B1 (en) | 2014-05-28 | 2020-02-25 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US11836725B2 (en) | 2014-05-29 | 2023-12-05 | Apple Inc. | User interface for payments |
US10796309B2 (en) | 2014-05-29 | 2020-10-06 | Apple Inc. | User interface for payments |
US10748153B2 (en) | 2014-05-29 | 2020-08-18 | Apple Inc. | User interface for payments |
US10977651B2 (en) | 2014-05-29 | 2021-04-13 | Apple Inc. | User interface for payments |
US10902424B2 (en) | 2014-05-29 | 2021-01-26 | Apple Inc. | User interface for payments |
US11625443B2 (en) | 2014-06-05 | 2023-04-11 | Snap Inc. | Web document enhancement |
US11921805B2 (en) | 2014-06-05 | 2024-03-05 | Snap Inc. | Web document enhancement |
US10524087B1 (en) | 2014-06-13 | 2019-12-31 | Snap Inc. | Message destination list mechanism |
US9532171B2 (en) | 2014-06-13 | 2016-12-27 | Snap Inc. | Geo-location based event gallery |
US10779113B2 (en) | 2014-06-13 | 2020-09-15 | Snap Inc. | Prioritization of messages within a message collection |
US10182311B2 (en) | 2014-06-13 | 2019-01-15 | Snap Inc. | Prioritization of messages within a message collection |
US9430783B1 (en) | 2014-06-13 | 2016-08-30 | Snapchat, Inc. | Prioritization of messages within gallery |
US9113301B1 (en) | 2014-06-13 | 2015-08-18 | Snapchat, Inc. | Geo-location based event gallery |
US11166121B2 (en) | 2014-06-13 | 2021-11-02 | Snap Inc. | Prioritization of messages within a message collection |
US10200813B1 (en) | 2014-06-13 | 2019-02-05 | Snap Inc. | Geo-location based event gallery |
US9693191B2 (en) | 2014-06-13 | 2017-06-27 | Snap Inc. | Prioritization of messages within gallery |
US11317240B2 (en) | 2014-06-13 | 2022-04-26 | Snap Inc. | Geo-location based event gallery |
US10448201B1 (en) | 2014-06-13 | 2019-10-15 | Snap Inc. | Prioritization of messages within a message collection |
US10659914B1 (en) | 2014-06-13 | 2020-05-19 | Snap Inc. | Geo-location based event gallery |
US10623891B2 (en) | 2014-06-13 | 2020-04-14 | Snap Inc. | Prioritization of messages within a message collection |
US9825898B2 (en) | 2014-06-13 | 2017-11-21 | Snap Inc. | Prioritization of messages within a message collection |
US9094137B1 (en) | 2014-06-13 | 2015-07-28 | Snapchat, Inc. | Priority based placement of messages in a geo-location based event gallery |
US11496673B1 (en) | 2014-07-07 | 2022-11-08 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10432850B1 (en) | 2014-07-07 | 2019-10-01 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11122200B2 (en) | 2014-07-07 | 2021-09-14 | Snap Inc. | Supplying content aware photo filters |
US9225897B1 (en) | 2014-07-07 | 2015-12-29 | Snapchat, Inc. | Apparatus and method for supplying content aware photo filters |
US10701262B1 (en) | 2014-07-07 | 2020-06-30 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10154192B1 (en) | 2014-07-07 | 2018-12-11 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11595569B2 (en) | 2014-07-07 | 2023-02-28 | Snap Inc. | Supplying content aware photo filters |
US11849214B2 (en) | 2014-07-07 | 2023-12-19 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10348960B1 (en) | 2014-07-07 | 2019-07-09 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10602057B1 (en) | 2014-07-07 | 2020-03-24 | Snap Inc. | Supplying content aware photo filters |
US9407816B1 (en) | 2014-07-07 | 2016-08-02 | Snapchat, Inc. | Apparatus and method for supplying content aware photo filters |
US20160018942A1 (en) * | 2014-07-15 | 2016-01-21 | Lg Electronics Inc. | Mobile terminal and control method thereof |
US10055717B1 (en) | 2014-08-22 | 2018-08-21 | Snap Inc. | Message processor with application prompts |
US11017363B1 (en) | 2014-08-22 | 2021-05-25 | Snap Inc. | Message processor with application prompts |
US10423983B2 (en) | 2014-09-16 | 2019-09-24 | Snap Inc. | Determining targeting information based on a predictive targeting model |
US11625755B1 (en) | 2014-09-16 | 2023-04-11 | Foursquare Labs, Inc. | Determining targeting information based on a predictive targeting model |
US11281701B2 (en) | 2014-09-18 | 2022-03-22 | Snap Inc. | Geolocation-based pictographs |
US11741136B2 (en) | 2014-09-18 | 2023-08-29 | Snap Inc. | Geolocation-based pictographs |
US10824654B2 (en) | 2014-09-18 | 2020-11-03 | Snap Inc. | Geolocation-based pictographs |
US11216869B2 (en) | 2014-09-23 | 2022-01-04 | Snap Inc. | User interface to augment an image using geolocation |
KR20170063886A (en) * | 2014-09-30 | 2017-06-08 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Displaying content on a display in power save mode |
US10217401B2 (en) * | 2014-09-30 | 2019-02-26 | Microsoft Technology Licensing, Llc | Displaying content on a display in power save mode |
KR102393742B1 (en) | 2014-09-30 | 2022-05-02 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Displaying content on a display in power save mode |
US10476830B2 (en) | 2014-10-02 | 2019-11-12 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US10708210B1 (en) | 2014-10-02 | 2020-07-07 | Snap Inc. | Multi-user ephemeral message gallery |
US11411908B1 (en) | 2014-10-02 | 2022-08-09 | Snap Inc. | Ephemeral message gallery user interface with online viewing history indicia |
US11038829B1 (en) | 2014-10-02 | 2021-06-15 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US10944710B1 (en) | 2014-10-02 | 2021-03-09 | Snap Inc. | Ephemeral gallery user interface with remaining gallery time indication |
US10958608B1 (en) | 2014-10-02 | 2021-03-23 | Snap Inc. | Ephemeral gallery of visual media messages |
US20170374003A1 (en) | 2014-10-02 | 2017-12-28 | Snapchat, Inc. | Ephemeral gallery of ephemeral messages |
US10284508B1 (en) | 2014-10-02 | 2019-05-07 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US11522822B1 (en) | 2014-10-02 | 2022-12-06 | Snap Inc. | Ephemeral gallery elimination based on gallery and message timers |
US11855947B1 (en) | 2014-10-02 | 2023-12-26 | Snap Inc. | Gallery of ephemeral messages |
US9537811B2 (en) | 2014-10-02 | 2017-01-03 | Snap Inc. | Ephemeral gallery of ephemeral messages |
US11012398B1 (en) | 2014-10-02 | 2021-05-18 | Snap Inc. | Ephemeral message gallery user interface with screenshot messages |
US10616476B1 (en) | 2014-11-12 | 2020-04-07 | Snap Inc. | User interface for accessing media at a geographic location |
US9843720B1 (en) | 2014-11-12 | 2017-12-12 | Snap Inc. | User interface for accessing media at a geographic location |
US11190679B2 (en) | 2014-11-12 | 2021-11-30 | Snap Inc. | Accessing media at a geographic location |
US11250887B2 (en) | 2014-12-19 | 2022-02-15 | Snap Inc. | Routing messages by message parameter |
US10580458B2 (en) | 2014-12-19 | 2020-03-03 | Snap Inc. | Gallery of videos set to an audio time line |
US9385983B1 (en) | 2014-12-19 | 2016-07-05 | Snapchat, Inc. | Gallery of messages from individuals with a shared interest |
US10811053B2 (en) | 2014-12-19 | 2020-10-20 | Snap Inc. | Routing messages by message parameter |
US11803345B2 (en) | 2014-12-19 | 2023-10-31 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US10311916B2 (en) | 2014-12-19 | 2019-06-04 | Snap Inc. | Gallery of videos set to an audio time line |
US10514876B2 (en) | 2014-12-19 | 2019-12-24 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US9854219B2 (en) | 2014-12-19 | 2017-12-26 | Snap Inc. | Gallery of videos set to an audio time line |
US11372608B2 (en) | 2014-12-19 | 2022-06-28 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11783862B2 (en) | 2014-12-19 | 2023-10-10 | Snap Inc. | Routing messages by message parameter |
US10120563B2 (en) * | 2014-12-24 | 2018-11-06 | Intel Corporation | User interface for liquid container |
US11169703B2 (en) | 2014-12-30 | 2021-11-09 | Huawei Technologies Co., Ltd. | Method for displaying graphical user interface and mobile terminal |
CN109901770A (en) * | 2014-12-30 | 2019-06-18 | 华为终端有限公司 | A kind of display methods and mobile terminal of graphic user interface |
US11429276B2 (en) | 2014-12-30 | 2022-08-30 | Huawei Technologies Co., Ltd. | Method for displaying graphical user interface and mobile terminal |
US10380720B1 (en) | 2015-01-09 | 2019-08-13 | Snap Inc. | Location-based image filters |
US11301960B2 (en) | 2015-01-09 | 2022-04-12 | Snap Inc. | Object recognition based image filters |
US11734342B2 (en) | 2015-01-09 | 2023-08-22 | Snap Inc. | Object recognition based image overlays |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US11388226B1 (en) | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
US11249617B1 (en) | 2015-01-19 | 2022-02-15 | Snap Inc. | Multichannel system |
US10416845B1 (en) | 2015-01-19 | 2019-09-17 | Snap Inc. | Multichannel system |
US10133705B1 (en) | 2015-01-19 | 2018-11-20 | Snap Inc. | Multichannel system |
US11528579B2 (en) | 2015-01-26 | 2022-12-13 | Snap Inc. | Content request by location |
US11910267B2 (en) | 2015-01-26 | 2024-02-20 | Snap Inc. | Content request by location |
US10932085B1 (en) | 2015-01-26 | 2021-02-23 | Snap Inc. | Content request by location |
US10536800B1 (en) | 2015-01-26 | 2020-01-14 | Snap Inc. | Content request by location |
US10123166B2 (en) | 2015-01-26 | 2018-11-06 | Snap Inc. | Content request by location |
US9612625B2 (en) | 2015-03-02 | 2017-04-04 | Blackberry Limited | System and method of rendering data based on an angle of a carrying case flap |
EP3065031A1 (en) * | 2015-03-02 | 2016-09-07 | BlackBerry Limited | System and method of rendering data based on an angle of a carrying case flap |
US11112957B2 (en) | 2015-03-08 | 2021-09-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10860177B2 (en) | 2015-03-08 | 2020-12-08 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10223397B1 (en) | 2015-03-13 | 2019-03-05 | Snap Inc. | Social graph based co-location of network users |
US10893055B2 (en) | 2015-03-18 | 2021-01-12 | Snap Inc. | Geo-fence authorization provisioning |
US10616239B2 (en) | 2015-03-18 | 2020-04-07 | Snap Inc. | Geo-fence authorization provisioning |
US11902287B2 (en) | 2015-03-18 | 2024-02-13 | Snap Inc. | Geo-fence authorization provisioning |
US11550471B2 (en) | 2015-03-19 | 2023-01-10 | Apple Inc. | Touch input cursor manipulation |
US11054990B2 (en) | 2015-03-19 | 2021-07-06 | Apple Inc. | Touch input cursor manipulation |
US10948717B1 (en) | 2015-03-23 | 2021-03-16 | Snap Inc. | Reducing boot time and power consumption in wearable display systems |
US11320651B2 (en) | 2015-03-23 | 2022-05-03 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US11662576B2 (en) | 2015-03-23 | 2023-05-30 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US11392633B2 (en) | 2015-05-05 | 2022-07-19 | Snap Inc. | Systems and methods for automated local story generation and curation |
US11496544B2 (en) | 2015-05-05 | 2022-11-08 | Snap Inc. | Story and sub-story navigation |
US10592574B2 (en) | 2015-05-05 | 2020-03-17 | Snap Inc. | Systems and methods for automated local story generation and curation |
US11449539B2 (en) | 2015-05-05 | 2022-09-20 | Snap Inc. | Automated local story generation and curation |
US9881094B2 (en) | 2015-05-05 | 2018-01-30 | Snap Inc. | Systems and methods for automated local story generation and curation |
US10911575B1 (en) | 2015-05-05 | 2021-02-02 | Snap Inc. | Systems and methods for story and sub-story navigation |
US10135949B1 (en) | 2015-05-05 | 2018-11-20 | Snap Inc. | Systems and methods for story and sub-story navigation |
US11321731B2 (en) | 2015-06-05 | 2022-05-03 | Apple Inc. | User interface for loyalty accounts and private label accounts |
US11734708B2 (en) | 2015-06-05 | 2023-08-22 | Apple Inc. | User interface for loyalty accounts and private label accounts |
US11783305B2 (en) | 2015-06-05 | 2023-10-10 | Apple Inc. | User interface for loyalty accounts and private label accounts for a wearable device |
US11681429B2 (en) | 2015-06-07 | 2023-06-20 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11240424B2 (en) | 2015-06-07 | 2022-02-01 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11231831B2 (en) | 2015-06-07 | 2022-01-25 | Apple Inc. | Devices and methods for content preview based on touch input intensity |
US10841484B2 (en) | 2015-06-07 | 2020-11-17 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10705718B2 (en) | 2015-06-07 | 2020-07-07 | Apple Inc. | Devices and methods for navigating between user interfaces |
US11835985B2 (en) | 2015-06-07 | 2023-12-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10993069B2 (en) | 2015-07-16 | 2021-04-27 | Snap Inc. | Dynamically adaptive media content delivery |
US20170031591A1 (en) * | 2015-07-31 | 2017-02-02 | Samsung Electronics Co., Ltd. | Screen controlling method and electronic device for supporting the same |
US10754542B2 (en) | 2015-08-10 | 2020-08-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10963158B2 (en) | 2015-08-10 | 2021-03-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10698598B2 (en) | 2015-08-10 | 2020-06-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11740785B2 (en) | 2015-08-10 | 2023-08-29 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11327648B2 (en) | 2015-08-10 | 2022-05-10 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11182017B2 (en) | 2015-08-10 | 2021-11-23 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10884608B2 (en) | 2015-08-10 | 2021-01-05 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10817898B2 (en) | 2015-08-13 | 2020-10-27 | Placed, Llc | Determining exposures to content presented by physical objects |
US9967649B2 (en) | 2015-09-30 | 2018-05-08 | Apple Inc. | Wireless pairing of earbuds and case |
US9967650B2 (en) | 2015-09-30 | 2018-05-08 | Apple Inc. | Case with inductive charging system to charge a portable device |
US10009678B2 (en) | 2015-09-30 | 2018-06-26 | Apple Inc. | Earbud case with receptacle connector for earbuds |
US10003880B2 (en) | 2015-09-30 | 2018-06-19 | Apple Inc. | Wireless earbuds with electronic contacts |
US10212506B2 (en) | 2015-09-30 | 2019-02-19 | Apple Inc. | Case with magnetic over-center mechanism |
US10225637B2 (en) | 2015-09-30 | 2019-03-05 | Apple Inc. | Magnetic retention of earbud within cavity |
US10003881B2 (en) | 2015-09-30 | 2018-06-19 | Apple Inc. | Earbuds with capacitive touch sensor |
US9973845B2 (en) | 2015-09-30 | 2018-05-15 | Apple Inc. | Earbuds with acoustic insert |
US11026011B2 (en) | 2015-09-30 | 2021-06-01 | Apple Inc. | Wireless earbud |
US11026010B2 (en) | 2015-09-30 | 2021-06-01 | Apple Inc. | Portable listening device with sensors |
US10904652B2 (en) | 2015-09-30 | 2021-01-26 | Apple Inc. | Earbud case with insert |
US9973840B2 (en) | 2015-09-30 | 2018-05-15 | Apple Inc. | Waterproof receptacle connector |
US9967648B2 (en) | 2015-09-30 | 2018-05-08 | Apple Inc. | Case with magnetic over-center mechanism |
US10097913B2 (en) | 2015-09-30 | 2018-10-09 | Apple Inc. | Earbud case with charging system |
US10880630B2 (en) | 2015-09-30 | 2020-12-29 | Apple Inc. | Wireless earbud |
US9967644B2 (en) | 2015-09-30 | 2018-05-08 | Apple Inc. | Magnetic retention of earbud within cavity |
US9961431B2 (en) | 2015-09-30 | 2018-05-01 | Apple Inc. | Earbud case with wireless radio shutdown feature |
US9961433B2 (en) | 2015-09-30 | 2018-05-01 | Apple Inc. | Case with inductive charging system to charge a portable device |
US10397683B2 (en) | 2015-09-30 | 2019-08-27 | Apple Inc. | Case with torsion spring over-center mechanism |
US10397682B2 (en) | 2015-09-30 | 2019-08-27 | Apple Inc. | Earbuds with acoustic insert |
US11690428B2 (en) | 2015-09-30 | 2023-07-04 | Apple Inc. | Portable listening device with accelerometer |
US10681446B2 (en) | 2015-09-30 | 2020-06-09 | Apple Inc. | Earbud case with pairing button |
US11944172B2 (en) | 2015-09-30 | 2024-04-02 | Apple Inc. | Portable listening device with sensors |
US10182282B2 (en) | 2015-09-30 | 2019-01-15 | Apple Inc. | Earbud case with charging system |
EP3151088B1 (en) * | 2015-10-02 | 2019-11-06 | BlackBerry Limited | Method and apparatus for movable assembly position sensing and virtual keyboard display |
US11315331B2 (en) | 2015-10-30 | 2022-04-26 | Snap Inc. | Image based tracking in augmented reality systems |
US11769307B2 (en) | 2015-10-30 | 2023-09-26 | Snap Inc. | Image based tracking in augmented reality systems |
US10733802B2 (en) | 2015-10-30 | 2020-08-04 | Snap Inc. | Image based tracking in augmented reality systems |
US10102680B2 (en) | 2015-10-30 | 2018-10-16 | Snap Inc. | Image based tracking in augmented reality systems |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11599241B2 (en) | 2015-11-30 | 2023-03-07 | Snap Inc. | Network resource location linking and visual content sharing |
US10474321B2 (en) | 2015-11-30 | 2019-11-12 | Snap Inc. | Network resource location linking and visual content sharing |
US10657708B1 (en) | 2015-11-30 | 2020-05-19 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11380051B2 (en) | 2015-11-30 | 2022-07-05 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11830117B2 (en) | 2015-12-18 | 2023-11-28 | Snap Inc | Media overlay publication system |
US11468615B2 (en) | 2015-12-18 | 2022-10-11 | Snap Inc. | Media overlay publication system |
US10997758B1 (en) | 2015-12-18 | 2021-05-04 | Snap Inc. | Media overlay publication system |
US10354425B2 (en) | 2015-12-18 | 2019-07-16 | Snap Inc. | Method and system for providing context relevant media augmentation |
US10834525B2 (en) | 2016-02-26 | 2020-11-10 | Snap Inc. | Generation, curation, and presentation of media collections |
US11611846B2 (en) | 2016-02-26 | 2023-03-21 | Snap Inc. | Generation, curation, and presentation of media collections |
US11197123B2 (en) | 2016-02-26 | 2021-12-07 | Snap Inc. | Generation, curation, and presentation of media collections |
US11889381B2 (en) | 2016-02-26 | 2024-01-30 | Snap Inc. | Generation, curation, and presentation of media collections |
US11023514B2 (en) | 2016-02-26 | 2021-06-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US10679389B2 (en) | 2016-02-26 | 2020-06-09 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11900418B2 (en) | 2016-04-04 | 2024-02-13 | Snap Inc. | Mutable geo-fencing system |
US11206309B2 (en) | 2016-05-19 | 2021-12-21 | Apple Inc. | User interface for remote authorization |
US10749967B2 (en) | 2016-05-19 | 2020-08-18 | Apple Inc. | User interface for remote authorization |
US11481769B2 (en) | 2016-06-11 | 2022-10-25 | Apple Inc. | User interface for transactions |
US11900372B2 (en) | 2016-06-12 | 2024-02-13 | Apple Inc. | User interfaces for transactions |
AU2020201019B2 (en) * | 2016-06-12 | 2021-07-08 | Apple Inc. | Devices and methods for accessing prevalent device functions |
WO2017218153A1 (en) * | 2016-06-12 | 2017-12-21 | Apple Inc. | Devices and methods for accessing prevalent device functions |
JP2020129380A (en) * | 2016-06-12 | Apple Inc. | Device and method for accessing general device function |
EP4040266A1 (en) * | 2016-06-12 | 2022-08-10 | Apple Inc. | Devices and methods for accessing prevalent device functions |
US10712934B2 (en) | 2016-06-12 | 2020-07-14 | Apple Inc. | Devices and methods for accessing prevalent device functions |
CN107491257A (en) * | 2016-06-12 | 2017-12-19 | 苹果公司 | For accessing the apparatus and method of common equipment function |
US11037150B2 (en) | 2016-06-12 | 2021-06-15 | Apple Inc. | User interfaces for transactions |
US11044393B1 (en) | 2016-06-20 | 2021-06-22 | Pipbin, Inc. | System for curation and display of location-dependent augmented reality content in an augmented estate system |
US10839219B1 (en) | 2016-06-20 | 2020-11-17 | Pipbin, Inc. | System for curation, distribution and display of location-dependent augmented reality content |
US10805696B1 (en) | 2016-06-20 | 2020-10-13 | Pipbin, Inc. | System for recording and targeting tagged content of user interest |
US10638256B1 (en) | 2016-06-20 | 2020-04-28 | Pipbin, Inc. | System for distribution and display of mobile targeted augmented reality content |
US11785161B1 (en) | 2016-06-20 | 2023-10-10 | Pipbin, Inc. | System for user accessibility of tagged curated augmented reality content |
US11201981B1 (en) | 2016-06-20 | 2021-12-14 | Pipbin, Inc. | System for notification of user accessibility of curated location-dependent content in an augmented estate |
US10992836B2 (en) | 2016-06-20 | 2021-04-27 | Pipbin, Inc. | Augmented property system of curated augmented reality media elements |
US11876941B1 (en) | 2016-06-20 | 2024-01-16 | Pipbin, Inc. | Clickable augmented reality content manager, system, and network |
US10735892B2 (en) | 2016-06-28 | 2020-08-04 | Snap Inc. | System to track engagement of media items |
US10506371B2 (en) | 2016-06-28 | 2019-12-10 | Snap Inc. | System to track engagement of media items |
US11640625B2 (en) | 2016-06-28 | 2023-05-02 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10785597B2 (en) | 2016-06-28 | 2020-09-22 | Snap Inc. | System to track engagement of media items |
US10430838B1 (en) | 2016-06-28 | 2019-10-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections with automated advertising |
US10219110B2 (en) | 2016-06-28 | 2019-02-26 | Snap Inc. | System to track engagement of media items |
US10885559B1 (en) | 2016-06-28 | 2021-01-05 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10165402B1 (en) | 2016-06-28 | 2018-12-25 | Snap Inc. | System to track engagement of media items |
US11445326B2 (en) | 2016-06-28 | 2022-09-13 | Snap Inc. | Track engagement of media items |
US10327100B1 (en) | 2016-06-28 | 2019-06-18 | Snap Inc. | System to track engagement of media items |
US11895068B2 (en) | 2016-06-30 | 2024-02-06 | Snap Inc. | Automated content curation and communication |
US11080351B1 (en) | 2016-06-30 | 2021-08-03 | Snap Inc. | Automated content curation and communication |
US10387514B1 (en) | 2016-06-30 | 2019-08-20 | Snap Inc. | Automated content curation and communication |
US11509615B2 (en) | 2016-07-19 | 2022-11-22 | Snap Inc. | Generating customized electronic messaging graphics |
US10348662B2 (en) | 2016-07-19 | 2019-07-09 | Snap Inc. | Generating customized electronic messaging graphics |
US11816853B2 (en) | 2016-08-30 | 2023-11-14 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US11074572B2 (en) | 2016-09-06 | 2021-07-27 | Apple Inc. | User interfaces for stored-value accounts |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11876762B1 (en) | 2016-10-24 | 2024-01-16 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11574041B2 (en) | 2016-10-25 | 2023-02-07 | Apple Inc. | User interface for managing access to credentials for use in an operation |
US10325567B2 (en) * | 2016-11-01 | 2019-06-18 | Hyundai Motor Company | Vehicle and method for controlling the same |
US11233952B2 (en) | 2016-11-07 | 2022-01-25 | Snap Inc. | Selective identification and order of image modifiers |
US11750767B2 (en) | 2016-11-07 | 2023-09-05 | Snap Inc. | Selective identification and order of image modifiers |
US10623666B2 (en) | 2016-11-07 | 2020-04-14 | Snap Inc. | Selective identification and order of image modifiers |
US11397517B2 (en) | 2016-12-09 | 2022-07-26 | Snap Inc. | Customized media overlays |
US10754525B1 (en) | 2016-12-09 | 2020-08-25 | Snap Inc. | Customized media overlays |
US10203855B2 (en) | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US20180225030A1 (en) * | 2017-01-17 | 2018-08-09 | Nanoport Technology Inc. | Electronic device having force-based modifiable graphical elements and method of operating same |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US10915911B2 (en) | 2017-02-03 | 2021-02-09 | Snap Inc. | System to determine a price-schedule to distribute media content |
US11720640B2 (en) | 2017-02-17 | 2023-08-08 | Snap Inc. | Searching social media content |
US10319149B1 (en) | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US11861795B1 (en) | 2017-02-17 | 2024-01-02 | Snap Inc. | Augmented reality anamorphosis system |
US11250075B1 (en) | 2017-02-17 | 2022-02-15 | Snap Inc. | Searching social media content |
US10614828B1 (en) | 2017-02-20 | 2020-04-07 | Snap Inc. | Augmented reality speech balloon system |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US11748579B2 (en) | 2017-02-20 | 2023-09-05 | Snap Inc. | Augmented reality speech balloon system |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
US11670057B2 (en) | 2017-03-06 | 2023-06-06 | Snap Inc. | Virtual vision system |
US10887269B1 (en) | 2017-03-09 | 2021-01-05 | Snap Inc. | Restricted group content collection |
US10523625B1 (en) | 2017-03-09 | 2019-12-31 | Snap Inc. | Restricted group content collection |
US11258749B2 (en) | 2017-03-09 | 2022-02-22 | Snap Inc. | Restricted group content collection |
US10581782B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US11349796B2 (en) | 2017-03-27 | 2022-05-31 | Snap Inc. | Generating a stitched data stream |
US10582277B2 (en) | 2017-03-27 | 2020-03-03 | Snap Inc. | Generating a stitched data stream |
US11558678B2 (en) | 2017-03-27 | 2023-01-17 | Snap Inc. | Generating a stitched data stream |
US11297399B1 (en) | 2017-03-27 | 2022-04-05 | Snap Inc. | Generating a stitched data stream |
US11170393B1 (en) | 2017-04-11 | 2021-11-09 | Snap Inc. | System to calculate an engagement score of location based media content |
US11195018B1 (en) | 2017-04-20 | 2021-12-07 | Snap Inc. | Augmented reality typography personalization system |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US11474663B2 (en) | 2017-04-27 | 2022-10-18 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11409407B2 (en) | 2017-04-27 | 2022-08-09 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11782574B2 (en) | 2017-04-27 | 2023-10-10 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11385763B2 (en) | 2017-04-27 | 2022-07-12 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11451956B1 (en) | 2017-04-27 | 2022-09-20 | Snap Inc. | Location privacy management on map-based social media platforms |
US11556221B2 (en) | 2017-04-27 | 2023-01-17 | Snap Inc. | Friend location sharing mechanism for social media platforms |
US11418906B2 (en) | 2017-04-27 | 2022-08-16 | Snap Inc. | Selective location-based identity communication |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US11392264B1 (en) | 2017-04-27 | 2022-07-19 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US11232040B1 (en) | 2017-04-28 | 2022-01-25 | Snap Inc. | Precaching unlockable data elements |
US11675831B2 (en) | 2017-05-31 | 2023-06-13 | Snap Inc. | Geolocation based playlists |
US10921873B2 (en) * | 2017-08-14 | 2021-02-16 | Samsung Electronics Co., Ltd. | Method for displaying content and electronic device thereof |
US20190050045A1 (en) * | 2017-08-14 | 2019-02-14 | Samsung Electronics Co., Ltd. | Method for displaying content and electronic device thereof |
US11475254B1 (en) | 2017-09-08 | 2022-10-18 | Snap Inc. | Multimodal entity identification |
US10783227B2 (en) | 2017-09-09 | 2020-09-22 | Apple Inc. | Implementation of biometric authentication |
US11386189B2 (en) | 2017-09-09 | 2022-07-12 | Apple Inc. | Implementation of biometric authentication |
US10521579B2 (en) | 2017-09-09 | 2019-12-31 | Apple Inc. | Implementation of biometric authentication |
US11765163B2 (en) | 2017-09-09 | 2023-09-19 | Apple Inc. | Implementation of biometric authentication |
US11393258B2 (en) | 2017-09-09 | 2022-07-19 | Apple Inc. | Implementation of biometric authentication |
US10872256B2 (en) | 2017-09-09 | 2020-12-22 | Apple Inc. | Implementation of biometric authentication |
US11721080B2 (en) | 2017-09-15 | 2023-08-08 | Snap Inc. | Augmented reality system |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US11335067B2 (en) | 2017-09-15 | 2022-05-17 | Snap Inc. | Augmented reality system |
US10499191B1 (en) | 2017-10-09 | 2019-12-03 | Snap Inc. | Context sensitive presentation of content |
US11006242B1 (en) | 2017-10-09 | 2021-05-11 | Snap Inc. | Context sensitive presentation of content |
US11617056B2 (en) | 2017-10-09 | 2023-03-28 | Snap Inc. | Context sensitive presentation of content |
US11670025B2 (en) | 2017-10-30 | 2023-06-06 | Snap Inc. | Mobile-based cartographic control of display content |
US11030787B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Mobile-based cartographic control of display content |
US11558327B2 (en) | 2017-12-01 | 2023-01-17 | Snap Inc. | Dynamic media overlay with smart widget |
US11943185B2 (en) | 2017-12-01 | 2024-03-26 | Snap Inc. | Dynamic media overlay with smart widget |
US11265273B1 (en) | 2017-12-01 | 2022-03-01 | Snap, Inc. | Dynamic media overlay with smart widget |
US11687720B2 (en) | 2017-12-22 | 2023-06-27 | Snap Inc. | Named entity recognition visual context and caption data |
US11017173B1 (en) | 2017-12-22 | 2021-05-25 | Snap Inc. | Named entity recognition visual context and caption data |
US10678818B2 (en) | 2018-01-03 | 2020-06-09 | Snap Inc. | Tag distribution visualization system |
US11487794B2 (en) | 2018-01-03 | 2022-11-01 | Snap Inc. | Tag distribution visualization system |
US11507614B1 (en) | 2018-02-13 | 2022-11-22 | Snap Inc. | Icon based tagging |
US11841896B2 (en) | 2018-02-13 | 2023-12-12 | Snap Inc. | Icon based tagging |
US10885136B1 (en) | 2018-02-28 | 2021-01-05 | Snap Inc. | Audience filtering system |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US11523159B2 (en) | 2018-02-28 | 2022-12-06 | Snap Inc. | Generating media content items based on location information |
US10524088B2 (en) | 2018-03-06 | 2019-12-31 | Snap Inc. | Geo-fence selection system |
US10327096B1 (en) | 2018-03-06 | 2019-06-18 | Snap Inc. | Geo-fence selection system |
US11044574B2 (en) | 2018-03-06 | 2021-06-22 | Snap Inc. | Geo-fence selection system |
US11722837B2 (en) | 2018-03-06 | 2023-08-08 | Snap Inc. | Geo-fence selection system |
US11570572B2 (en) | 2018-03-06 | 2023-01-31 | Snap Inc. | Geo-fence selection system |
US11491393B2 (en) | 2018-03-14 | 2022-11-08 | Snap Inc. | Generating collectible items based on location information |
US10933311B2 (en) | 2018-03-14 | 2021-03-02 | Snap Inc. | Generating collectible items based on location information |
US11163941B1 (en) | 2018-03-30 | 2021-11-02 | Snap Inc. | Annotating a collection of media content items |
US10924886B2 (en) | 2018-04-18 | 2021-02-16 | Snap Inc. | Visitation tracking system |
US10448199B1 (en) | 2018-04-18 | 2019-10-15 | Snap Inc. | Visitation tracking system |
US11683657B2 (en) | 2018-04-18 | 2023-06-20 | Snap Inc. | Visitation tracking system |
US10779114B2 (en) | 2018-04-18 | 2020-09-15 | Snap Inc. | Visitation tracking system |
US10681491B1 (en) | 2018-04-18 | 2020-06-09 | Snap Inc. | Visitation tracking system |
US10219111B1 (en) | 2018-04-18 | 2019-02-26 | Snap Inc. | Visitation tracking system |
US11297463B2 (en) | 2018-04-18 | 2022-04-05 | Snap Inc. | Visitation tracking system |
US11860888B2 (en) | 2018-05-22 | 2024-01-02 | Snap Inc. | Event detection system |
US11170085B2 (en) | 2018-06-03 | 2021-11-09 | Apple Inc. | Implementation of biometric authentication |
US11928200B2 (en) | 2018-06-03 | 2024-03-12 | Apple Inc. | Implementation of biometric authentication |
US11367234B2 (en) | 2018-07-24 | 2022-06-21 | Snap Inc. | Conditional modification of augmented reality object |
US11670026B2 (en) | 2018-07-24 | 2023-06-06 | Snap Inc. | Conditional modification of augmented reality object |
US10789749B2 (en) | 2018-07-24 | 2020-09-29 | Snap Inc. | Conditional modification of augmented reality object |
US10943381B2 (en) | 2018-07-24 | 2021-03-09 | Snap Inc. | Conditional modification of augmented reality object |
US10679393B2 (en) | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US11676319B2 (en) | 2018-08-31 | 2023-06-13 | Snap Inc. | Augmented reality anthropomorphization system |
US11450050B2 (en) | 2018-08-31 | 2022-09-20 | Snap Inc. | Augmented reality anthropomorphization system |
US11172101B1 (en) | 2018-09-20 | 2021-11-09 | Apple Inc. | Multifunction accessory case |
US11704005B2 (en) | 2018-09-28 | 2023-07-18 | Snap Inc. | Collaborative achievement interface |
US11100349B2 (en) | 2018-09-28 | 2021-08-24 | Apple Inc. | Audio assisted enrollment |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11809784B2 (en) | 2018-09-28 | 2023-11-07 | Apple Inc. | Audio assisted enrollment |
US11619991B2 (en) | 2018-09-28 | 2023-04-04 | Apple Inc. | Device control using gaze information |
US10860096B2 (en) | 2018-09-28 | 2020-12-08 | Apple Inc. | Device control using gaze information |
US11799811B2 (en) | 2018-10-31 | 2023-10-24 | Snap Inc. | Messaging and gaming applications communication platform |
CN111192547A (en) * | 2018-11-14 | 2020-05-22 | 乐金显示有限公司 | Foldable display and driving method thereof |
US11558709B2 (en) | 2018-11-30 | 2023-01-17 | Snap Inc. | Position service to determine relative position to map features |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US11812335B2 (en) | 2018-11-30 | 2023-11-07 | Snap Inc. | Position service to determine relative position to map features |
US11698722B2 (en) | 2018-11-30 | 2023-07-11 | Snap Inc. | Generating customized avatars based on location information |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US11693887B2 (en) | 2019-01-30 | 2023-07-04 | Snap Inc. | Adaptive spatial density based clustering |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11500525B2 (en) | 2019-02-25 | 2022-11-15 | Snap Inc. | Custom media overlay system |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11669896B2 (en) | 2019-03-24 | 2023-06-06 | Apple Inc. | User interfaces for managing an account |
US11328352B2 (en) | 2019-03-24 | 2022-05-10 | Apple Inc. | User interfaces for managing an account |
US11688001B2 (en) | 2019-03-24 | 2023-06-27 | Apple Inc. | User interfaces for managing an account |
US11610259B2 (en) | 2019-03-24 | 2023-03-21 | Apple Inc. | User interfaces for managing an account |
US10783576B1 (en) | 2019-03-24 | 2020-09-22 | Apple Inc. | User interfaces for managing an account |
US11249614B2 (en) | 2019-03-28 | 2022-02-15 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11740760B2 (en) | 2019-03-28 | 2023-08-29 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11361493B2 (en) | 2019-04-01 | 2022-06-14 | Snap Inc. | Semantic texture mapping system |
US11206615B2 (en) | 2019-05-30 | 2021-12-21 | Snap Inc. | Wearable device location systems |
US11785549B2 (en) | 2019-05-30 | 2023-10-10 | Snap Inc. | Wearable device location systems |
US11606755B2 (en) | 2019-05-30 | 2023-03-14 | Snap Inc. | Wearable device location systems architecture |
US11917495B2 (en) | 2019-06-07 | 2024-02-27 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11821742B2 (en) | 2019-09-26 | 2023-11-21 | Snap Inc. | Travel based notifications |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11429618B2 (en) | 2019-12-30 | 2022-08-30 | Snap Inc. | Surfacing augmented reality objects |
US11729343B2 (en) | 2019-12-30 | 2023-08-15 | Snap Inc. | Including video feed in message thread |
US11343323B2 (en) | 2019-12-31 | 2022-05-24 | Snap Inc. | Augmented reality objects registry |
US11943303B2 (en) | 2019-12-31 | 2024-03-26 | Snap Inc. | Augmented reality objects registry |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11888803B2 (en) | 2020-02-12 | 2024-01-30 | Snap Inc. | Multiple gateway message exchange |
US11228551B1 (en) | 2020-02-12 | 2022-01-18 | Snap Inc. | Multiple gateway message exchange |
US11765117B2 (en) | 2020-03-05 | 2023-09-19 | Snap Inc. | Storing data based on device location |
US11516167B2 (en) | 2020-03-05 | 2022-11-29 | Snap Inc. | Storing data based on device location |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11915400B2 (en) | 2020-03-27 | 2024-02-27 | Snap Inc. | Location mapping for large scale augmented-reality |
US11776256B2 (en) | 2020-03-27 | 2023-10-03 | Snap Inc. | Shared augmented reality system |
US11430091B2 (en) | 2020-03-27 | 2022-08-30 | Snap Inc. | Location mapping for large scale augmented-reality |
US11483267B2 (en) | 2020-06-15 | 2022-10-25 | Snap Inc. | Location sharing using different rate-limited links |
US11290851B2 (en) | 2020-06-15 | 2022-03-29 | Snap Inc. | Location sharing using offline and online objects |
US11314776B2 (en) | 2020-06-15 | 2022-04-26 | Snap Inc. | Location sharing using friend list versions |
US11503432B2 (en) | 2020-06-15 | 2022-11-15 | Snap Inc. | Scalable real-time location sharing framework |
US11816194B2 (en) | 2020-06-21 | 2023-11-14 | Apple Inc. | User interfaces for managing secure operations |
US11676378B2 (en) | 2020-06-29 | 2023-06-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US11943192B2 (en) | 2020-08-31 | 2024-03-26 | Snap Inc. | Co-location connection service |
US11606756B2 (en) | 2021-03-29 | 2023-03-14 | Snap Inc. | Scheduling requests for location data |
US11601888B2 (en) | 2021-03-29 | 2023-03-07 | Snap Inc. | Determining location using multi-source geolocation data |
US11902902B2 (en) | 2021-03-29 | 2024-02-13 | Snap Inc. | Scheduling requests for location data |
US11645324B2 (en) | 2021-03-31 | 2023-05-09 | Snap Inc. | Location-based timeline media content system |
US11829834B2 (en) | 2021-10-29 | 2023-11-28 | Snap Inc. | Extended QR code |
US11956533B2 (en) | 2021-11-29 | 2024-04-09 | Snap Inc. | Accessing media at a geographic location |
US11954314B2 (en) | 2022-09-09 | 2024-04-09 | Snap Inc. | Custom media overlay system |
Also Published As
Publication number | Publication date |
---|---|
US9778706B2 (en) | 2017-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9778706B2 (en) | Peekable user interface on a portable electronic device | |
US10042534B2 (en) | Mobile terminal and method to change display screen | |
CN107479737B (en) | Portable electronic device and control method thereof | |
US8443199B2 (en) | Mobile terminal and method of controlling the mobile terminal | |
US8413075B2 (en) | Gesture movies | |
US8224392B2 (en) | Mobile terminal capable of recognizing fingernail touch and method of controlling the operation thereof | |
US9632578B2 (en) | Method and device for switching tasks | |
US20130222272A1 (en) | Touch-sensitive navigation in a tab-based application interface | |
US20130167093A1 (en) | Display apparatus for releasing locked state and method thereof | |
US9746937B2 (en) | Method and apparatus for movable assembly position sensing and virtual keyboard display | |
US20140337720A1 (en) | Apparatus and method of executing function related to user input on screen | |
CA2806801C (en) | Peekable user interface on a portable electronic device | |
US20140003653A1 (en) | System and Method for Determining the Position of an Object Displaying Media Content | |
EP2680106A1 (en) | System and method for determining the position of an object displaying media content |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RESEARCH IN MOTION LIMITED, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MCKENZIE, DONALD SOMERSET;REEL/FRAME:027758/0084 Effective date: 20120221 |
|
AS | Assignment |
Owner name: BLACKBERRY LIMITED, ONTARIO Free format text: CHANGE OF NAME;ASSIGNOR:RESEARCH IN MOTION LIMITED;REEL/FRAME:034161/0093 Effective date: 20130709 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
CC | Certificate of correction |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064104/0103 Effective date: 20230511 |
|
AS | Assignment |
Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064271/0199 Effective date: 20230511 |