US8471914B2 - Image processing system with ease of operation - Google Patents


Info

Publication number
US8471914B2
Authority
US
United States
Prior art keywords
image processing
information
equipment
portable terminal
processing apparatus
Prior art date
Legal status: Active, expires
Application number
US13/296,966
Other versions
US20120120259A1 (en)
Inventor
Daisuke Sakiyama
Takeshi Morikawa
Takeshi Minami
Kaitaku Ozawa
Kazuya Anezaki
Current Assignee
Konica Minolta Business Technologies Inc
Original Assignee
Konica Minolta Business Technologies Inc
Priority date
Filing date
Publication date
Application filed by Konica Minolta Business Technologies Inc
Assigned to KONICA MINOLTA BUSINESS TECHNOLOGIES, INC. Assignors: ANEZAKI, KAZUYA; MINAMI, TAKESHI; MORIKAWA, TAKESHI; OZAWA, KAITAKU; SAKIYAMA, DAISUKE
Publication of US20120120259A1
Application granted
Publication of US8471914B2
Status: Active


Classifications

    • G06F 3/1204: Improving or facilitating administration, e.g. print management, resulting in reduced user or operator actions, e.g. presetting, automatic actions, using hardware token storing data
    • G06F 3/1226: Discovery of devices having required properties
    • G06F 3/1267: Job repository, e.g. non-scheduled jobs, delay printing
    • G06F 3/1273: Print job history, e.g. logging, accounting, tracking
    • G06F 3/1289: Remote printer device in server-client-printer device configuration, e.g. the server does not see the printer
    • G06F 3/1292: Mobile client, e.g. wireless printing
    • H04N 1/00323: Connection or combination of a still picture apparatus with a measuring, monitoring or signaling apparatus, e.g. for transmitting measured information to a central location
    • H04N 1/00344: Connection or combination of a still picture apparatus with a management, maintenance, service or repair apparatus
    • H04N 2201/3202: Display, printing, storage or transmission of a communication or activity log or report
    • H04N 2201/3252: Image capture parameters, e.g. resolution, illumination conditions, orientation of the image capture device
    • H04N 2201/3253: Position information, e.g. geographical position at time of capture, GPS data

Definitions

  • The present invention relates to an image processing system in which an image processing apparatus is operated by a portable terminal, as well as to a corresponding control method and portable terminal.
  • Image processing apparatuses such as copiers, printers, and MFPs (Multi-Functional Peripherals) that combine those functions are widely used in office environments, and many users have occasion to use them.
  • Portable information terminals can obtain information stored beforehand in association with positional information from a server and combine that information with a captured image, allowing users to view information such as facility usage status or reservation information based on the captured image.
  • Japanese Laid-Open Patent Publication No. 2003-022229, previously filed by the applicant of the present application, discloses an information processing system for facilitating operation of an image processing apparatus: information indicating a user's operation location and operation content is stored as history in association with information on the operation target, and when an operation input at a given location is accepted, the information stored in association with that operation location and operation content is extracted and presented.
  • An object of the present invention is to provide an image processing system in which an image processing apparatus can be operated easily using a portable terminal, as well as a corresponding control method and portable terminal.
  • An image processing system includes a portable terminal, equipment, and an information processing apparatus. At least one piece of the equipment is an image processing apparatus including a controller.
  • The portable terminal includes a shooting unit, an obtaining unit for obtaining positional information and orientation information of the portable terminal, a display unit, and an input unit for inputting an instruction on an operation screen displayed on the display unit.
  • The information processing apparatus includes a storage unit for storing, as information about the equipment, positional information of the equipment and communication information for communicating with the equipment. The portable terminal transmits positional information and orientation information at the time of shooting by the shooting unit to the information processing apparatus.
  • The information processing apparatus detects the piece of equipment included in an image shot by the shooting unit of the portable terminal, based on the positional information and orientation information at the time of shooting at the portable terminal, and transmits information about the detected equipment to the portable terminal.
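The detection step can be sketched as follows. This is an illustrative sketch only, not the patent's actual method: it assumes devices are registered with flat floor-plan coordinates and that the terminal reports a compass heading, and the names `Equipment` and `detect_equipment` are invented for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Equipment:
    name: str
    x: float  # assumed flat floor-plan position, metres east of an origin
    y: float  # metres north of the origin

def detect_equipment(term_x, term_y, heading_deg, devices, fov_deg=60.0):
    """Return the registered device whose bearing from the terminal is
    closest to the terminal's heading, if it lies within half of an
    assumed camera field of view; otherwise None."""
    best, best_off = None, None
    for dev in devices:
        # Bearing from terminal to device, clockwise from north.
        bearing = math.degrees(math.atan2(dev.x - term_x, dev.y - term_y)) % 360
        # Smallest angular difference between bearing and heading.
        off = abs((bearing - heading_deg + 180) % 360 - 180)
        if off <= fov_deg / 2 and (best_off is None or off < best_off):
            best, best_off = dev, off
    return best
```

For example, a terminal at the origin facing due east (heading 90°) would detect a device ten metres to its east rather than one to its north; a real implementation would also account for distance and altitude.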
  • The portable terminal further includes a controller that obtains an operation history on the image processing apparatus based on the received information about the equipment, causes the display unit to display an operation screen presenting the operation history in a selectable manner, and, when a selection of the operation history is accepted at the input unit, transmits to the image processing apparatus a control signal for causing it to execute the image processing specified by the selected operation history.
  • The information processing apparatus further stores an operation history as part of the information about the equipment, and the controller of the portable terminal causes the display unit to display the operation screen using the operation history included in the received information about the equipment.
  • The controller of the portable terminal may instead request an operation history from the equipment, based on the communication information for communicating with the equipment included in the received information about the equipment, and obtain the operation history received from the equipment.
  • The portable terminal further includes a storage unit for storing positional information and orientation information at the time when the equipment was shot, in association with information specifying the equipment.
  • The portable terminal transmits to the information processing apparatus the positional information and orientation information at the time when the selected equipment was shot.
  • The equipment includes a controller for controlling the image processing apparatus.
  • The operation history includes information specifying the image processing apparatus that executed the image processing designated by the operation, the function of the image processing apparatus necessary for the image processing, and the image data subjected to the image processing.
  • The controller of the portable terminal sorts the operation history so that it is displayed per image processing apparatus that executed the image processing designated by the operation history.
  • The controller of the portable terminal sorts the operation history so that it is displayed per function of the image processing apparatus necessary for the image processing designated by the operation history.
  • The operation history may further include information specifying the image processing apparatus that executed the image processing designated by the operation, the function of the image processing apparatus necessary for the image processing, the image data subjected to the image processing, and the storage location of image data obtained by the image processing.
  • The controller of the portable terminal sorts the operation history so that it is displayed per storage location of image data obtained by the image processing designated by the operation history.
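The sorting described above amounts to grouping history entries by a chosen key. A minimal sketch, assuming each entry is a dictionary whose field names (`mfp`, `function`, `document`, `dest`) and sample values are invented for illustration:

```python
from collections import defaultdict

# Illustrative operation-history entries: the apparatus that executed the
# job, the function used, the document processed, and the storage location.
history = [
    {"mfp": "MFP 100A", "function": "copy", "document": "a.pdf", "dest": "PC 200A"},
    {"mfp": "MFP 100B", "function": "scan", "document": "b.pdf", "dest": "PC 200B"},
    {"mfp": "MFP 100A", "function": "scan", "document": "c.pdf", "dest": "PC 200A"},
]

def group_history(entries, key):
    """Group entries for display, e.g. per apparatus ("mfp"), per
    function ("function"), or per storage location ("dest")."""
    groups = defaultdict(list)
    for entry in entries:
        groups[entry[key]].append(entry)
    return dict(groups)

by_mfp = group_history(history, "mfp")
# by_mfp["MFP 100A"] holds the two entries executed on MFP 100A
```

The terminal could then render one operation-screen section per group.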
  • The controller of the portable terminal causes the display unit to display the operation history in a selectable and changeable manner on the operation screen, and when an instruction to change the operation history is accepted on the operation screen, transmits to the image processing apparatus a control signal for causing it to execute the image processing specified by the changed operation history.
  • The image processing system includes a portable terminal having a shooting unit and a display unit; equipment, at least one piece of which is an image processing apparatus; and an information processing apparatus.
  • The information processing apparatus stores, as information about the equipment, positional information of the equipment and communication information for communicating with the equipment.
  • The control method includes the steps of: causing the portable terminal to transmit positional information and orientation information at the time of shooting by its shooting unit to the information processing apparatus; causing the information processing apparatus to detect the piece of equipment included in the shot image, based on the positional and orientation information received from the portable terminal and the positional information included in the information about the equipment, and to transmit the information about the detected equipment to the portable terminal; causing the portable terminal to obtain an operation history on the image processing apparatus based on the transmitted information about the equipment, and to display an operation screen presenting the operation history in a selectable manner; when a selection of the operation history is accepted, causing the portable terminal to transmit to the image processing apparatus a control signal for executing the image processing specified by the selected operation history; and causing the image processing apparatus to execute the corresponding image processing based on the signal.
  • A portable terminal includes a shooting unit, an obtaining unit for obtaining positional information and orientation information of the portable terminal, a display unit, an input unit for inputting an instruction on an operation screen displayed on the display unit, and a controller.
  • The controller executes a process of transmitting positional information and orientation information at the time of shooting by the shooting unit to an information processing apparatus; a process of obtaining an operation history on an image processing apparatus based on information about equipment received from the information processing apparatus and causing the display unit to display an operation screen presenting the operation history in a selectable manner; and a process of accepting a selection of the operation history at the input unit and transmitting to the image processing apparatus a control signal for executing the image processing specified by the selected operation history.
  • In the process of displaying the operation screen, the controller further executes a process of requesting the operation history from the equipment, based on the communication information included in the received information about the equipment, and a process of receiving the operation history from the equipment.
  • A non-transitory computer-readable recording medium is encoded with a control program for causing a portable terminal to execute processing.
  • The portable terminal includes a shooting unit and a display unit.
  • The control program causes the portable terminal to execute the steps of: transmitting positional information and orientation information at the time of shooting by the shooting unit to an information processing apparatus; obtaining an operation history on an image processing apparatus based on information about equipment received from the information processing apparatus, and displaying on the display unit an operation screen presenting the operation history in a selectable manner; and accepting a selection of the operation history and transmitting to the image processing apparatus a control signal for executing the image processing specified by the selected operation history.
  • The displaying step of the control program includes the steps of requesting the operation history from the equipment, based on the communication information included in the received information about the equipment, and receiving the operation history from the equipment.
  • FIG. 1 is a diagram showing a specific example of a configuration of an image processing system according to an embodiment.
  • FIG. 2 is a diagram showing a specific example of a hardware configuration of an MFP (Multi-Functional Peripheral) included in the image processing system according to the embodiment.
  • FIG. 3 is a diagram showing a specific example of a hardware configuration of a PC included in the image processing system according to the embodiment.
  • FIG. 4 is a diagram showing a specific example of a hardware configuration of a portable terminal included in the image processing system according to the embodiment.
  • FIG. 5 is a diagram showing a specific example of a hardware configuration of a server included in the image processing system according to the embodiment.
  • FIG. 6A and FIG. 6B are diagrams showing a specific example of equipment information.
  • FIG. 7 is a block diagram showing a specific example of a functional configuration of the portable terminal.
  • FIG. 8 is a block diagram showing a specific example of a functional configuration of the server.
  • FIG. 9 is a sequence diagram depicting a flow of an operation for operating the MFP in an operation flow 1.
  • FIG. 10 is a flowchart showing a specific example of an operation in the portable terminal performing an operation for operating the MFP.
  • FIG. 11 is a diagram showing a specific example of an operation screen appearing on an operation panel of the portable terminal through the process in step S113 in FIG. 10.
  • FIG. 12 is a diagram showing another specific example of the operation screen.
  • FIG. 13 is a sequence diagram depicting a flow of an operation for operating the MFP in an operation flow 2.
  • FIG. 14 is a diagram showing a specific example of a select screen for selecting a device for which operation history is requested.
  • FIG. 15 is a diagram showing a specific example of a functional configuration of the portable terminal according to a modified embodiment.
  • FIG. 1 is a diagram showing a specific example of a configuration of an image processing system according to an embodiment.
  • The image processing system includes an MFP (Multi-Functional Peripheral) 100 serving as an image processing apparatus, a personal computer (hereinafter referred to as PC) 200 serving as a control device for controlling MFP 100, a server 300, and a portable terminal 400.
  • The image processing apparatus is not limited to an MFP and may be a printer, a facsimile machine, a copier, or any other device having at least one image processing function.
  • MFP 100 is an image processing apparatus that combines these functions.
  • The image processing system may include a plurality of MFPs 100A and 100B, collectively referred to as MFP 100.
  • The information processing apparatus is not limited to a PC and may be any other device as long as it stores a program for controlling MFP 100, such as a printer driver, and includes a CPU (Central Processing Unit) 20 (FIG. 3) that executes the program to output an operation signal to MFP 100.
  • Such a device may be, for example, a mobile phone or a portable document reader, and may be combined with portable terminal 400 described later.
  • The image processing system may include a plurality of PCs 200A and 200B, collectively referred to as PC 200.
  • Portable terminal 400 is, for example, a mobile phone or any other device at least having a camera function, an instruction input function, and a communication function.
  • Server 300 is a general personal computer or any other similar device.
  • MFP 100, PC 200, and server 300 are connected to a wired or wireless network such as a LAN.
  • Portable terminal 400 can also connect to the network, so that it can communicate with each of MFP 100, PC 200, and server 300 via the network.
  • MFP 100, server 300, and PC 200 each have a wireless communication function.
  • Portable terminal 400 can wirelessly communicate with each of MFP 100, PC 200, and server 300.
  • Bluetooth® or infrared communication, for example, can be used for the wireless communication.
  • Server 300 stores information on each of MFP 100 and PC 200 included in the image processing system as "equipment information."
  • The equipment information includes an operation history for each device. Specifically, where the equipment is MFP 100, the operation history of that MFP 100 is included; where the equipment is PC 200, the operation history concerning MFP 100 held by that PC 200 is included.
  • The equipment information further includes positional information representing the location of each device and an IP address used for communication via the LAN.
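The per-device record the server keeps can be pictured as a small data structure. This is a hedged sketch only; the field names (`device_id`, `ip_address`, `history`) and the flat in-memory registry are assumptions, not the patent's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class EquipmentInfo:
    """One server-side 'equipment information' record: position,
    communication information, and the accumulated operation history."""
    device_id: str
    latitude: float
    longitude: float
    altitude: float
    ip_address: str
    history: list = field(default_factory=list)

# Simple in-memory registry keyed by device identifier.
registry = {}

def register(info: EquipmentInfo):
    registry[info.device_id] = info

register(EquipmentInfo("MFP100A", 35.68, 139.76, 12.0, "192.0.2.10"))
```

Looking up a device then yields both its position (for the detection step) and its IP address (for the terminal to contact it directly).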
  • The operation history is transmitted to server 300 by MFP 100 and PC 200 and registered there.
  • The operation history may be registered, for example, in such a manner that every time a device accepts an operation input and generates a control signal in accordance with it, history information representing the control signal is generated and transmitted to server 300.
  • Alternatively, each device may accumulate the history information in a predetermined memory and transmit it to server 300 at a preset timing, for example when requested by server 300, at predetermined time intervals, when a predetermined amount of data has accumulated, or when the device connects to the LAN.
  • The history information in each device may be deleted from the storage area upon transmission to server 300, or may be left there after transmission.
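The accumulate-and-transmit behaviour above can be sketched as a small buffer that flushes when a threshold is reached. The `send` callback stands in for the actual LAN transfer, and the class and parameter names are invented for the example:

```python
class HistoryBuffer:
    """Accumulate history records and flush them to the server when a
    count threshold is reached (other triggers, such as timers or an
    explicit server request, could call flush() directly)."""

    def __init__(self, send, threshold=3, keep_after_send=False):
        self.send = send                      # stand-in for the LAN transfer
        self.threshold = threshold
        self.keep_after_send = keep_after_send
        self.records = []

    def add(self, record):
        self.records.append(record)
        if len(self.records) >= self.threshold:
            self.flush()

    def flush(self):
        if self.records:
            self.send(list(self.records))
            if not self.keep_after_send:      # delete after transmitting...
                self.records.clear()          # ...or keep, per the text above

sent = []
buf = HistoryBuffer(sent.append, threshold=2)
buf.add("copy a.pdf")
buf.add("scan b.pdf")  # threshold reached: both records are flushed
```

The `keep_after_send` flag mirrors the choice of deleting the history after transmission or leaving it in the storage area.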
  • The operation history includes at least information specifying the MFP 100 that executed the image processing, information (a function name) specifying the function of MFP 100 used in the image processing, and the data (a document name) subjected to the image processing.
  • The operation history may additionally include information specifying a device (a target PC) that is the storage location of image data obtained as a result of the image processing, and may include information specifying the user who performed the operation, as described later.
  • The storage location of image data obtained as a result of the image processing refers to the storage area, or to a device having the storage area.
  • That is, the storage location refers to the device to which the image data is transmitted.
  • The device serving as the storage location of image data may be a PC or MFP 100 itself.
  • The positional information is, for example, a combination of latitude, longitude, and altitude, or the closest access point of the LAN.
  • The positional information may be registered in server 300, for example by an administrator when each device is installed, or may be registered in each device and transmitted to server 300 at a predetermined timing for registration.
  • A device with a positioning function may obtain its own positional information at a predetermined timing using that function and transmit it to server 300 for registration.
  • The information for communication may be, for example, an IP address.
  • The communication information may instead be a Bluetooth® address.
  • More generally, the communication information may be an address corresponding to each kind of communication. In the following description, it is assumed that each device in the image processing system communicates via the LAN and stores an IP address as the communication information.
  • The communication information may be registered in server 300, for example by an administrator when each device is installed, or may be registered in each device and transmitted to server 300 at a predetermined timing for registration.
  • The communication information may also be transmitted to server 300 by each device and registered at the time it is assigned, or at a predetermined later timing.
  • The user carrying portable terminal 400 points it at MFP 100 or PC 200 and shoots a photo, whereupon the equipment information of the shot device is transmitted from server 300 to portable terminal 400.
  • An operation panel 45 (FIG. 4) of portable terminal 400 then displays an operation screen that allows selection of the operation history included in the equipment information.
  • The user selects the operation history representing the operation that the user wants MFP 100 to execute. A control signal for executing the same image processing as that represented by the selected operation history is then transmitted from portable terminal 400 to MFP 100, and MFP 100 executes the image processing.
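The final step, turning a selected history entry into a control signal, can be sketched as follows. The JSON message format, field names, and the `send_to_mfp` stub are assumptions for illustration; the patent does not specify a wire format:

```python
import json

def build_control_signal(entry):
    """Encode the image processing specified by a selected history entry
    as a control message for the target MFP."""
    return json.dumps({
        "target": entry["mfp"],
        "function": entry["function"],
        "document": entry["document"],
    })

def send_to_mfp(signal, transport):
    # Stand-in for transmission to the MFP over the LAN (e.g. via the
    # IP address held in the equipment information).
    transport.append(signal)

# The entry the user selected on the operation screen (illustrative data).
selected = {"mfp": "MFP 100A", "function": "copy", "document": "report.pdf"}
wire = []
send_to_mfp(build_control_signal(selected), wire)
```

On receipt, the MFP would decode the message and execute the same image processing as the original operation.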
  • FIG. 2 shows a specific example of a hardware configuration of MFP 100 .
  • MFP 100 includes a CPU (Central Processing Unit) 10 as an arithmetic unit for controlling MFP 100 as a whole, a ROM (Read Only Memory) 11 for storing a program executed in CPU 10 , a RAM (Random Access Memory) 12 functioning as a work area for executing a program in CPU 10 , a scanner 13 for optically scanning a document placed on a not-shown platen to obtain image data, a printer 14 for fixing image data on print paper, an operation panel 15 including a touch panel for displaying information and accepting an operation input to MFP 100 , a memory 16 for storing image data, and a network controller 17 for controlling communication via the LAN.
  • FIG. 3 shows a specific example of a hardware configuration of PC 200 .
  • PC 200 includes a CPU 20 as an arithmetic unit for controlling PC 200 as a whole, a ROM 21 for storing a program executed in CPU 20 , a RAM 22 functioning as a work area for executing a program in CPU 20 , an operation unit 25 for performing an operation input, a memory 26 for storing a variety of information and operation history, and a network controller 27 for controlling communication via the LAN.
  • the hardware configuration shown in FIG. 3 is a hardware configuration of a general personal computer, and the hardware configuration of PC 200 is not limited to the one shown in FIG. 3 . Specifically, any other component may be additionally included, or may be included in place of part of the configuration shown in FIG. 3 .
  • FIG. 4 shows a specific example of a hardware configuration of portable terminal 400 .
  • portable terminal 400 includes a CPU 40 as an arithmetic unit for controlling portable terminal 400 as a whole, a ROM 41 for storing a program executed in CPU 40 , a RAM 42 functioning as a work area for executing a program in CPU 40 , an electronic compass 43 including a magnetic sensor for detecting an orientation of portable terminal 400 , a GPS (Global Positioning System) controller 44 receiving a GPS signal or a positional signal from a base station for obtaining positional information of portable terminal 400 , an operation panel 45 including a touch panel for displaying information and accepting an operation input to portable terminal 400 , a camera 46 , and a network controller 47 for controlling communication via the LAN.
  • Operation panel 45 may be configured similar to operation panel 15 of MFP 100 . More specifically, it includes, for example, a touch panel formed of a display such as a liquid crystal display and a position designating device such as an optical touch panel or a capacitive touch panel, and operation keys.
  • CPU 40 allows the touch panel to display an operation screen based on the operation history included in the equipment information transmitted from server 300 as described later.
  • CPU 40 specifies the designated position on the touch panel, generates a control signal for allowing MFP 100 to execute image processing based on screen data of the operation screen and the specified position, and transmits the control signal to MFP 100 .
  • Electronic compass 43 and GPS controller 44 output a signal to CPU 40 to indicate the obtained orientation or positional information of portable terminal 400 .
  • the hardware configuration shown in FIG. 4 is a hardware configuration necessary for portable terminal 400 to execute the operation illustrated in the operation overview above, and portable terminal 400 is not limited to the one only including this hardware configuration.
  • a speaker, a microphone, and a communication controller for communicating with a base station may be included in a case where portable terminal 400 has a call function.
  • FIG. 5 shows a specific example of a hardware configuration of server 300 .
  • server 300 is formed, for example, of a general computer as described above.
  • server 300 includes a CPU 30 as an arithmetic unit for controlling server 300 as a whole, a ROM 31 for storing a program executed in CPU 30 , a RAM 32 functioning as a work area for executing a program in CPU 30 , an HD (Hard Disk) 33 for storing the equipment information and the like, and a network controller 34 for controlling communication via the LAN.
  • FIG. 6A and FIG. 6B show a specific example of the equipment information stored in HD 33 .
  • FIG. 6A illustrates the equipment information of MFP 100
  • FIG. 6B illustrates the equipment information of PC 200 .
  • Stored as the equipment information of MFP 100 are information (equipment name) specifying the MFP, positional information, an IP address as communication information, and the operation history.
  • Stored as the equipment information of PC 200 are information (equipment name) specifying the PC, positional information, an IP address as communication information, and the operation history of MFP 100 .
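For illustration only, the equipment information records of FIG. 6A and FIG. 6B might be modeled as simple data structures. The field names and sample values below are hypothetical; the patent does not specify any storage format:

```python
from dataclasses import dataclass, field

@dataclass
class EquipmentInfo:
    """Hypothetical record mirroring FIG. 6A/6B: name, position,
    communication information, and operation history."""
    equipment_name: str                      # e.g. "MFP 100" or "PC 200"
    position: tuple                          # positional information (x, y)
    ip_address: str                          # communication information
    operation_history: list = field(default_factory=list)

# Illustrative entries in the style of FIG. 6A (MFP 100) and FIG. 6B (PC 200)
mfp_info = EquipmentInfo("MFP 100", (10.0, 5.0), "192.168.0.10",
                         [{"function": "scan_to_pc", "document": "doc1.pdf",
                           "target_pc": "PC 200"}])
pc_info = EquipmentInfo("PC 200", (12.0, 5.0), "192.168.0.20",
                        [{"function": "print", "document": "doc2.pdf",
                          "target_pc": "PC 200"}])
```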
  • FIG. 7 is a block diagram showing a specific example of a functional configuration of portable terminal 400 .
  • Each function shown in FIG. 7 is a function mainly formed in CPU 40 when CPU 40 reads out a program stored in ROM 41 and executes the program on RAM 42 .
  • at least part of the functions may be formed by the hardware configuration shown in FIG. 4 .
  • portable terminal 400 includes an instruction input unit 401 for accepting an instruction input from operation panel 45 , a position obtaining unit 402 for obtaining positional information of portable terminal 400 in response to a shooting instruction from operation panel 45 , an orientation obtaining unit 403 for obtaining an orientation of portable terminal 400 in response to a shooting instruction from operation panel 45 , an image obtaining unit 404 for obtaining image data captured by shooting by camera 46 in response to a shooting instruction from operation panel 45 , a server request unit 405 for requesting the equipment information from server 300 together with the positional information and orientation information obtained in response to a shooting instruction from operation panel 45 , an information obtaining unit 406 for obtaining the equipment information from server 300 in response to the request, a screen generation unit 407 for generating screen data for allowing operation panel 45 to display an operation screen based on the obtained equipment information, a command generation unit 408 for generating a control signal, and a transmission unit 409 for transmitting the generated control signal to MFP 100 .
  • Since position obtaining unit 402 and orientation obtaining unit 403 obtain positional information and orientation information, respectively, in response to a shooting instruction from operation panel 45 , the information thereof can serve as information of a shooting position and information of a shooting direction, respectively. Therefore, in the description below, the positional information and orientation information transmitted from portable terminal 400 to server 300 in accordance with the program are also referred to as shooting position information and shooting direction information, respectively.
  • Screen generation unit 407 generates screen data for allowing operation panel 45 to display an operation screen which allows a selection of operation history, by referring to the operation history included in the equipment information.
  • CPU 40 performs a display process for allowing operation panel 45 to display an operation screen based on the screen data, whereby the operation screen appears on operation panel 45 .
  • Instruction input unit 401 also accepts an instruction input from operation panel 45 , which specifies a position on operation panel 45 displaying the operation screen. A signal specifying the position represented by the instruction input is input to command generation unit 408 .
  • Command generation unit 408 specifies the designated position on the operation screen, based on the signal and the screen data.
  • Command generation unit 408 stores the correspondence between a position on the screen data and a position on operation panel 45 beforehand and specifies the operation history corresponding to the position based on the correspondence. Then, a control signal for allowing MFP 100 to execute image processing represented by the operation history is generated. The generated control signal is input to transmission unit 409 .
  • Transmission unit 409 performs a process for transmitting the generated control signal to the IP address, by referring to the IP address included in the equipment information.
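The work of command generation unit 408 and transmission unit 409 could be sketched roughly as below. The command structure and field names are invented for illustration; the patent does not specify a control-signal format:

```python
def build_control_signal(history_entry, ip_address):
    """Build a command asking the MFP at ip_address to repeat the image
    processing described by the selected history entry (command
    generation unit 408 / transmission unit 409, fields hypothetical)."""
    return {
        "destination": ip_address,          # IP address from equipment info
        "command": "execute",
        "function": history_entry["function"],
        "document": history_entry["document"],
        "target_pc": history_entry.get("target_pc"),
    }

# A history entry as it might appear on the operation screen
entry = {"function": "scan_to_pc", "document": "doc1.pdf", "target_pc": "PC 200"}
signal = build_control_signal(entry, "192.168.0.10")
```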
  • FIG. 8 is a block diagram showing a specific example of a functional configuration of server 300 .
  • Each function shown in FIG. 8 is a function mainly formed in CPU 30 when CPU 30 reads out a program stored in ROM 31 and executes the program on RAM 32 .
  • at least part of the functions may be formed by the hardware configuration shown in FIG. 5 .
  • server 300 includes an equipment information input unit 301 for accepting input of positional information, history information, and address information from each equipment, an equipment information storage unit 302 for storing or updating the equipment information input into a predetermined area in HD 33 , a portable information input unit 303 for accepting input of a shooting position and a shooting direction together with a request for equipment information from portable terminal 400 , a search unit 304 for searching for the equipment present in the shot image by portable terminal 400 based on the shooting position and the shooting direction of portable terminal 400 , and a transmission unit 305 for transmitting the equipment information about the found equipment to portable terminal 400 .
  • FIG. 9 is a sequence diagram illustrating a flow of an operation for operating the MFP in operation flow 1 .
  • FIG. 9 shows a flow of processing in MFP 100 on the left side, a flow of processing in portable terminal 400 at the middle, and a flow of processing in server 300 on the right side.
  • Each operation is implemented when the CPU of each device reads out a program stored in the ROM and executes the program on the RAM.
  • a process of transmitting an operation history from MFP 100 to server 300 is performed, and the operation history is registered as the equipment information.
  • the operation history concerning the image processing is transmitted to server 300 at a predetermined timing (#01-1).
  • In server 300 receiving the operation history, a process for registering the received operation history in the equipment information of MFP 100 is executed (#03).
  • the operation history of MFP 100 is stored as the equipment information in server 300 .
  • As an operation for operating the MFP, in a state in which the application for operating the MFP is being activated in portable terminal 400 (step S 1 ), the camera shoots a photo (step S 3 ). Thereafter, upon input of an instruction for operating the MFP (step S 5 ), the information that specifies a shooting position and a shooting direction at portable terminal 400 is transmitted to server 300 , whereby the corresponding equipment information is requested (step S 5 - 1 ).
  • Server 300 accepts the request from portable terminal 400 , specifies the equipment located in a prescribed range in the shooting direction from the shooting position of portable terminal 400 , by referring to each positional information in the stored equipment information, and searches for the equipment information about the specified equipment (step S 7 ). Then, the corresponding equipment information is transmitted to portable terminal 400 (step S 7 - 1 ).
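The search in step S 7 can be pictured as a geometric filter: equipment whose stored position lies within a prescribed distance of the shooting position and within an angular tolerance of the shooting direction. The sketch below assumes planar coordinates and illustrative thresholds; none of these specifics are given in the patent:

```python
import math

def find_equipment(equipment_list, shot_pos, shot_dir_deg,
                   max_dist=20.0, half_angle_deg=30.0):
    """Return equipment within max_dist of shot_pos and within
    half_angle_deg of the shooting direction (thresholds illustrative)."""
    found = []
    for eq in equipment_list:
        dx = eq["position"][0] - shot_pos[0]
        dy = eq["position"][1] - shot_pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > max_dist:
            continue
        bearing = math.degrees(math.atan2(dy, dx)) % 360
        # Smallest angular difference between bearing and shooting direction
        diff = abs((bearing - shot_dir_deg + 180) % 360 - 180)
        if diff <= half_angle_deg:
            found.append(eq)
    return found

equipment = [
    {"name": "MFP 100", "position": (10.0, 0.0)},
    {"name": "PC 200",  "position": (0.0, 10.0)},
]
# Shooting from the origin, pointing along the +x axis (0 degrees):
hits = find_equipment(equipment, (0.0, 0.0), 0.0)
# only "MFP 100" lies inside the 30-degree cone
```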
  • Of the equipment information, at least the operation history and communication information are transmitted.
  • the application causes an operation screen to appear to present the operation history included in the equipment information in a selectable manner (step S 9 ).
  • When an operation history is selected (touched) on the operation screen appearing on operation panel 45 of portable terminal 400 (step S 11 ), a control signal is generated for allowing MFP 100 to execute image processing indicated by the selected operation history (step S 13 ). Then, the control signal is transmitted to MFP 100 (step S 13 - 1 ).
  • the designated image processing is executed in accordance with the control signal (step S 15 ).
  • the operation history may be associated with information (for example, a user ID) specifying the user who performs the operation, as described above.
  • Server 300 may store the correspondence between information specifying portable terminal 400 and a user ID beforehand.
  • server 300 may specify the user ID related to portable terminal 400 and search for the equipment information which is about the equipment located within a predetermined range in the shooting direction from the shooting position of portable terminal 400 and which includes the operation history associated with the user ID. Then, the found equipment information may be transmitted in step S 7 - 1 .
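The per-user narrowing described here might, as a sketch, filter the stored history by the user ID associated with the requesting terminal. Field names are hypothetical:

```python
def filter_history_by_user(equipment_info, user_id):
    """Keep only the history entries recorded for the given user ID
    (the association of histories with user IDs is described above)."""
    return [h for h in equipment_info["operation_history"]
            if h.get("user_id") == user_id]

info = {"operation_history": [
    {"user_id": "u001", "function": "copy"},
    {"user_id": "u002", "function": "scan_to_pc"},
]}
mine = filter_history_by_user(info, "u001")
```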
  • In step S 9 , an operation screen appears which allows the user related to portable terminal 400 to select an operation history. Therefore, in the case where the same operation as the operation performed before is repeated, an operation screen that is easier to operate is displayed.
  • the equipment information of MFP 100 is transmitted as equipment information from server 300 .
  • This is an example of the operation in the case where MFP 100 is shot by portable terminal 400 .
  • in the case where PC 200 is shot, the operation history on MFP 100 included in the equipment information of PC 200 is received from server 300 .
  • PC 200 generates and transmits to MFP 100 a command for executing image processing according to a user's instruction.
  • PC 200 transmits the operation history concerning the command to server 300 at a predetermined timing. Accordingly, the operation history on MFP 100 is registered as the equipment information of PC 200 in server 300 .
  • FIG. 10 is a flowchart illustrating a specific example of an operation in portable terminal 400 performing an operation for operating the MFP.
  • the operation shown in the flowchart in FIG. 10 is implemented when CPU 40 reads out a program stored in ROM 41 corresponding to the application for operating the MFP and executes the read program on RAM 42 .
  • In a state in which CPU 40 is executing the application for operating the MFP (YES in step S 101 ), if an instruction for operating the MFP is input from operation panel 45 (YES in step S 103 ), then, in step S 105 , CPU 40 transmits information representing a shooting position and a shooting direction to server 300 and requests transmission of the equipment information of the corresponding equipment.
  • CPU 40 executes a process for displaying an operation screen presenting the operation history in a selectable manner on operation panel 45 , in step S 113 .
  • FIG. 11 is a diagram showing a specific example of the operation screen appearing on operation panel 45 through the process in step S 113 as described above.
  • FIG. 11 shows a specific example of the operation screen appearing when the equipment information of MFP 100 is received as equipment information from server 300 .
  • the operation history displayed as a choice in the operation screen includes, for each image processing performed in MFP 100 , information (function name) specifying the function of MFP 100 that is used in the image processing, data (document name) subjected to the image processing, and information specifying a device (target PC) which is a storage location of image data obtained as a result of the image processing.
  • the operation history shown as a choice in the operation screen is not necessarily displayed with all of the above-noted information and may be displayed with at least one of them. Alternatively, only the information specifying the operation history itself may be displayed. In this case, when the operation history based on such information is selected, the next screen or a pop-up screen may appear to display the detailed contents of the selected operation history.
  • CPU 40 sorts the operation history so as to be displayed for each storage location of image data, and generates screen data for the operation screen.
  • CPU 40 sorts the operation history so as to be displayed for each function of the MFP that is necessary for the image processing designated by the operation history, and generates screen data for the operation screen.
  • FIG. 11 illustrates the operation screen based on the screen data generated by sorting the operation history in any of the foregoing manners. In this way, as shown in FIG. 11 , the operation history is displayed for each storage location of image data or for each function of the MFP, thereby allowing the user to easily find the desired operation history.
  • CPU 40 sorts the operation history so as to be displayed for each MFP that has executed image processing designated by the operation history, and generates screen data for the operation screen. In this manner, the operation history is displayed for each MFP, thereby allowing the user to easily find the desired operation history.
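The sorting described in these alternatives amounts to grouping history entries by a chosen key, e.g. storage location (target PC), function, or executing MFP. A minimal sketch with hypothetical field names:

```python
from collections import defaultdict

def sort_history_for_screen(history, key):
    """Group history entries by the chosen key ("target_pc", "function",
    or "mfp") so each group appears together on the operation screen."""
    groups = defaultdict(list)
    for entry in history:
        groups[entry.get(key, "unknown")].append(entry)
    return dict(groups)

history = [
    {"mfp": "MFP 100", "function": "scan_to_pc", "target_pc": "PC 200"},
    {"mfp": "MFP 100", "function": "copy",       "target_pc": "PC 201"},
    {"mfp": "MFP 100", "function": "scan_to_pc", "target_pc": "PC 200"},
]
by_function = sort_history_for_screen(history, "function")
```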
  • Upon accepting an operation input on the operation screen (YES in step S 115 ), in step S 117 , CPU 40 generates a control signal for allowing MFP 100 to execute the image processing indicated by the operation history designated by the operation input.
  • In step S 115 , an operation to change the operation history may be accepted in place of an instruction to select. This is also applicable to the example described below.
  • When generating screen data for displaying the operation screen as shown in FIG. 11 , CPU 40 generates screen data for displaying a function name, a document name, or a storage location of image data in a changeable manner in each operation history. For example, as shown in FIG. 12 , a pull-down button is displayed next to each of the function name, the document name, and the target PC. Pressing the button causes another function, another document, or another device to show up as a choice.
  • Other functions of MFP 100 , other document names, and other devices may be stored beforehand in portable terminal 400 , or may be obtained with reference to the function included in any other operation history, or may be included in the equipment information about MFP 100 that is transmitted from server 300 .
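Changing one item of an operation history via the pull-down of FIG. 12 can be sketched as replacing a single field with one of the stored alternatives. The helper below is illustrative only; its names and validation are not part of the patent:

```python
def apply_history_change(entry, field_name, new_value, allowed):
    """Replace one field of a history entry with a choice from the
    pull-down; 'allowed' lists the selectable alternatives."""
    if new_value not in allowed:
        raise ValueError(f"{new_value!r} is not an available choice")
    changed = dict(entry)       # leave the original history entry intact
    changed[field_name] = new_value
    return changed

entry = {"function": "scan_to_pc", "document": "doc1.pdf", "target_pc": "PC 200"}
changed = apply_history_change(entry, "target_pc", "PC 201",
                               allowed=["PC 200", "PC 201"])
```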
  • In step S 117 , based on the changed operation history, CPU 40 generates a control signal for allowing MFP 100 to execute the image processing indicated by that operation history.
  • server 300 stores the positional information and communication information as the equipment information of each of MFP 100 and PC 200 , and does not have to store the operation history.
  • FIG. 13 is a sequence diagram illustrating a flow of an operation for operating the MFP in operation flow 2 .
  • FIG. 13 shows a flow of processing in MFP 100 on the left, a flow of processing in portable terminal 400 , second from left, a flow of processing in server 300 , third from left, and a flow of processing in PC 200 on the right.
  • Each operation is implemented when the CPU of each device reads out a program stored in the ROM and executes the program on the RAM.
  • the information specifying the shooting position and shooting direction at portable terminal 400 is transmitted to server 300 in step S 5 - 1 .
  • the equipment information about the corresponding device is searched for in server 300 in step S 7 , similarly as in operation flow 1 illustrated in FIG. 9 .
  • the found equipment information is transmitted from server 300 to portable terminal 400 in step S 7 - 1 ′.
  • In this flow, it is not a precondition that the equipment information includes the operation history, and of the equipment information, at least the communication information is transmitted in step S 7 - 1 ′.
  • portable terminal 400 receiving the equipment information specifies the equipment from which the operation history is requested, based on the communication information included in the equipment information (step S 8 ).
  • the operation history is requested from the specified equipment (step S 8 - 1 ).
  • the operation history of MFP 100 is transmitted to portable terminal 400 (step S 8 - 2 ).
  • each device stores the correspondence between the information specifying portable terminal 400 and the user ID beforehand, so that each device can transmit the operation history associated with the user ID corresponding to portable terminal 400 that has requested the operation history, among the stored operation history, to portable terminal 400 , in step S 8 - 2 .
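Steps S 8 through S 8 - 2 of operation flow 2 could be sketched as follows. The transport and message format are stand-ins, since the patent does not specify a protocol:

```python
def request_operation_history(equipment_info, user_id, fetch):
    """Ask the equipment at the IP address in equipment_info for its
    operation history (steps S8 to S8-2); 'fetch' stands in for whatever
    transport the system actually uses."""
    ip = equipment_info["ip_address"]                # communication information
    request = {"type": "get_history", "user_id": user_id}
    return fetch(ip, request)

# A stand-in transport that answers like a device would in step S8-2,
# returning only the history associated with the requesting user's ID
def fake_fetch(ip, request):
    store = {"192.168.0.20": [{"user_id": "u001", "function": "print"},
                              {"user_id": "u002", "function": "copy"}]}
    return [h for h in store.get(ip, [])
            if h["user_id"] == request["user_id"]]

history = request_operation_history({"ip_address": "192.168.0.20"},
                                    "u001", fake_fetch)
```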
  • FIG. 13 shows an example in which the equipment information about PC 200 is transmitted from server 300 to portable terminal 400 , and the operation history of MFP 100 is requested by portable terminal 400 from PC 200 based on the equipment information.
  • FIG. 13 is an example of operation flow 2 , which is also applicable to a case where the equipment information about MFP 100 is transmitted from server 300 to portable terminal 400 . More specifically, also in this case, it is not a precondition that the equipment information includes the operation history, and portable terminal 400 receiving the equipment information requests the operation history from MFP 100 based on the communication information included in the equipment information and obtains the operation history from MFP 100 responding to the request.
  • The operation after step S 9 is similar to operation flow 1 shown in FIG. 9 .
  • MFP 100 can be operated using portable terminal 400 as described in the operation overview.
  • the user can activate the dedicated application in portable terminal 400 , which is familiar to the user, and point portable terminal 400 to shoot a device, so that the operation history on MFP 100 stored in association with the device is displayed in a selectable manner. Then, the operation history corresponding to the desired operation is selected therefrom, thereby allowing MFP 100 to execute the image processing designated by that operation.
  • the user can easily perform an operation for executing image processing indicated by the operation history.
  • the user does not have to move there and can operate MFP 100 with portable terminal 400 the user carries.
  • server 300 stores the positional information of each device beforehand and specifies the device located within a shooting range obtained from the shooting position and shooting direction from portable terminal 400 .
  • image data obtained by shooting a device included in the image processing system with camera 46 is stored beforehand in portable terminal 400 , in association with the positional information and orientation information at the time of shooting.
  • a shot image is stored in portable terminal 400 in association with the shooting position and shooting direction.
  • FIG. 14 is a diagram showing a specific example of a select screen for selecting a device for which operation history is requested.
  • image data of each device shot before is stored in portable terminal 400 , and the application is activated to display the shot images in a selectable manner.
  • an icon for selecting each device may be displayed, or an entry field for designating one of the devices in text may be displayed. It is noted that in these cases, the shooting position and shooting direction of the corresponding device are associated and stored beforehand.
  • FIG. 15 is a diagram showing a specific example of a functional configuration of portable terminal 400 according to a modified embodiment.
  • Each function shown in FIG. 15 is also a function mainly formed in CPU 40 when CPU 40 reads out a program stored in ROM 41 and executes the program on RAM 42 .
  • at least part of the functions may be formed by the hardware configuration shown in FIG. 4 .
  • portable terminal 400 includes an equipment specifying unit 410 in place of position obtaining unit 402 , orientation obtaining unit 403 , and image obtaining unit 404 shown in FIG. 7 .
  • Equipment specifying unit 410 accepts input of a signal indicating an operation position on the select screen as shown in FIG. 14 at instruction input unit 401 from operation panel 45 , specifies the selected equipment based on the signal, and outputs the shooting position and shooting direction stored in association with the equipment to server request unit 405 .
  • Server request unit 405 of portable terminal 400 requests the equipment information from server 300 along with the input shooting position and shooting direction.
  • When image data of the equipment shot before is stored, MFP 100 can be operated using the shot image without shooting a photo again. Even when portable terminal 400 does not have a shooting function, if the information specifying each device is stored beforehand, MFP 100 can be operated using the stored information.
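In the modified embodiment, selecting a device on the screen of FIG. 14 resolves to the shooting position and shooting direction stored for that device. A minimal sketch, assuming a simple in-terminal mapping (names hypothetical):

```python
def lookup_shot_info(selected_name, stored_shots):
    """Return the shooting position and direction stored in association
    with the selected device (equipment specifying unit 410, FIG. 15)."""
    shot = stored_shots[selected_name]
    return shot["position"], shot["direction"]

# Previously shot devices stored with their shooting position/direction
stored = {"MFP 100": {"position": (0.0, 0.0), "direction": 0.0}}
pos, direction = lookup_shot_info("MFP 100", stored)
# pos and direction would then be sent to server 300 by server request unit 405
```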
  • the present invention also provides a program for allowing each device included in the image processing system to execute the foregoing operation.
  • a program may be stored in a computer-readable recording medium accompanying a computer, such as a flexible disk, a CD-ROM (Compact Disk-Read Only Memory), a ROM, a RAM, and a memory card, and be provided as a program product.
  • the program may be stored in a recording medium such as a hard disk contained in a computer.
  • the program may be downloaded via a network.
  • the program in accordance with the present invention may allow the process to be executed by invoking necessary modules, among program modules provided as a part of Operating System (OS) of a computer, in a prescribed sequence at a prescribed timing.
  • In this case, the modules are not included in the program itself, and the process is executed in cooperation with the OS.
  • the program that does not include such modules may also be included in the program in accordance with the present invention.
  • the program in accordance with the present invention may be embedded in a part of another program.
  • the modules included in another program are not included in the program itself, and the process is executed in cooperation with another program.
  • Such a program embedded in another program may also be included in the program in accordance with the present invention.
  • the provided program product is installed in a program storage unit such as a hard disk for execution. It is noted that the program product includes the program itself and a recording medium encoded with the program.

Abstract

When a shot is taken by a portable terminal, the position and orientation at the time of shooting are transmitted to an information processing apparatus. The information processing apparatus, which stores information including the position of equipment, detects the equipment included in the image shot by the portable terminal, based on the position and orientation of the portable terminal and the position of the equipment, and transmits information about the equipment to the portable terminal. The portable terminal obtains an operation history on an image processing apparatus in the equipment based on the received information, and displays an operation screen presenting the operation history in a selectable manner on a display unit. Then, when a selection of operation history is accepted, a control signal for allowing the image processing apparatus to execute image processing indicated by the operation history is transmitted to the image processing apparatus.

Description

This application is based on Japanese Patent Application No. 2010-254514 filed with the Japan Patent Office on Nov. 15, 2010, the entire content of which is hereby incorporated by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an image processing system, a control method, and a portable terminal, and more particularly to an image processing system in which an image processing apparatus is operated by a portable terminal, a control method, and the portable terminal.
2. Description of the Related Art
Image processing apparatuses such as copiers, printers, and MFPs (Multi-Functional Peripherals) including those functions are often used in office environments. Many users have opportunities to use those apparatuses.
On one hand, with the recent widespread use of portable terminals, users carry portable equipment such as mobile phones having a function of connecting to the Internet, a camera function, and a position detection function, and have familiarized themselves with using them. Then, as disclosed in, for example, Japanese Laid-Open Patent Publication Nos. 2006-351024 and 2006-091390, some portable information terminals obtain information stored beforehand in association with positional information from a server and combine the information with the captured image, thereby allowing users to view information such as facility usage status or reservation information based on the captured image.
On the other hand, Japanese Laid-Open Patent Publication No. 2003-022229, previously filed by the applicant of the present application, discloses an information processing system to facilitate operations of an image processing apparatus, in which information indicating an operation location of a user and an operation content are stored as history in association with information of an operation target, whereby the information stored in association with the operation location and the operation content is extracted and presented when an operation input at a certain location is accepted.
As such image processing apparatuses have grown more sophisticated, the operations for users to give operation instructions become complicated. Therefore, the users who use the apparatuses less frequently or the users who use different kinds of apparatuses find it difficult to recognize available functions or find it difficult to use the apparatuses due to the complicated operations.
Then, in light of the widespread use of portable terminals as described above, users may desire to use portable terminals familiar to them even when operating image processing apparatuses.
SUMMARY OF THE INVENTION
The present invention is made in view of the foregoing problem. An object of the present invention is to provide an image processing system in which an image processing apparatus can be operated easily using a portable terminal, a control method, and a portable terminal.
In order to achieve the object, in accordance with an aspect of the present invention, an image processing system includes a portable terminal, equipment, and an information processing apparatus. At least one of the equipment is an image processing apparatus including a controller. The portable terminal includes a shooting unit, an obtaining unit for obtaining positional information and orientation information of the portable terminal, a display unit, and an input unit for inputting an instruction on an operation screen displayed on the display unit. The information processing apparatus includes a storage unit for storing, as information about the equipment, positional information of the equipment and communication information for communicating with the equipment. The portable terminal transmits positional information and orientation information at a time of shooting by the shooting unit to the information processing apparatus. The information processing apparatus detects one of the equipment that is included in an image shot by the shooting unit of the portable terminal, based on the positional information and orientation information at a time of shooting at the portable terminal, and transmits information about the detected equipment to the portable terminal. The portable terminal further includes a controller for executing a process of obtaining an operation history on the image processing apparatus, based on the received information about the equipment, allowing the display unit to display an operation screen presenting the operation history in a selectable manner, and when accepting a selection of the operation history at the input unit, transmitting a control signal for allowing the image processing apparatus to execute image processing specified by the selected operation history, to the image processing apparatus.
Preferably, the information processing apparatus further stores an operation history as the information about the equipment, and the controller of the portable terminal allows the display unit to display the operation screen using the operation history included in the received information about the equipment.
Preferably, the controller of the portable terminal requests an operation history from the equipment, based on the communication information for communicating with the equipment that is included in the received information about the equipment, and obtains the operation history received from the equipment.
Preferably, the portable terminal further includes a storage unit for storing positional information and orientation information at a time when the equipment is shot, in association with information specifying the equipment. When accepting a selection of the equipment, the portable terminal transmits positional information and orientation information at a time when the selected equipment is shot, to the information processing apparatus.
Preferably, the equipment includes a controller for controlling the image processing apparatus.
Preferably, the operation history includes information specifying an image processing apparatus that has executed image processing designated by the operation, a function of the image processing apparatus that is necessary for the image processing, and image data subjected to the image processing.
More preferably, when allowing the display unit to display the operation screen, the controller of the portable terminal sorts the operation history so as to be displayed for each image processing apparatus that has executed image processing designated by the operation history.
Preferably, when allowing the display unit to display the operation screen, the controller of the portable terminal sorts the operation history so as to be displayed for each function of the image processing apparatus that is necessary for image processing designated by the operation history.
Preferably, the operation history includes information specifying an image processing apparatus that has executed image processing designated by the operation, a function of the image processing apparatus that is necessary for the image processing, image data subjected to the image processing, and a storage location of image data obtained by the image processing. When allowing the display unit to display the operation screen, the controller of the portable terminal sorts the operation history so as to be displayed for each storage location of image data obtained by image processing designated by the operation history.
Preferably, the controller of the portable terminal allows the display unit to display the operation history in a selectable and changeable manner on the operation screen, and when accepting an instruction to change the operation history on the operation screen, transmits a control signal for allowing the image processing apparatus to execute image processing specified by the changed operation history, to the image processing apparatus.
In accordance with another aspect of the present invention, a control method for an image processing system is provided. The image processing system includes a portable terminal having a shooting unit and a display unit, equipment, at least one of which is an image processing apparatus, and an information processing apparatus. The information processing apparatus stores, as information about the equipment, positional information of the equipment and communication information for communicating with the equipment. The control method includes the steps of: causing the portable terminal to transmit positional information and orientation information at a time of shooting by the shooting unit of the portable terminal to the information processing apparatus; causing the information processing apparatus to detect one of the equipment that is included in an image shot by the shooting unit of the portable terminal, based on the positional information and orientation information received from the portable terminal and the positional information included in the information about the equipment, and to transmit the information about the detected equipment to the portable terminal; causing the portable terminal to obtain an operation history on the image processing apparatus in the equipment, based on the information about the equipment transmitted from the information processing apparatus, and to allow the display unit to display an operation screen presenting the operation history in a selectable manner; when accepting a selection of the operation history, causing the portable terminal to transmit a control signal for allowing the image processing apparatus to execute image processing specified by the selected operation history, to the image processing apparatus; and executing corresponding image processing based on the signal in the image processing apparatus.
In accordance with a further aspect of the present invention, a portable terminal includes a shooting unit, an obtaining unit for obtaining positional information and orientation information of the portable terminal, a display unit, an input unit for inputting an instruction on an operation screen displayed on the display unit, and a controller. The controller executes a process of transmitting positional information and orientation information at a time of shooting by the shooting unit to an information processing apparatus, a process of obtaining an operation history on an image processing apparatus in equipment based on information about equipment that is received from the information processing apparatus, and allowing the display unit to display an operation screen presenting the operation history in a selectable manner, and a process of accepting a selection of the operation history at the input unit and then transmitting a control signal for allowing the image processing apparatus to execute image processing specified by the selected operation history, to the image processing apparatus.
Preferably, the controller further executes a process of requesting the operation history from the equipment based on communication information for communicating with the equipment that is included in the received information about the equipment, in the process of allowing the display unit to display an operation screen, and a process of receiving the operation history from the equipment.
In accordance with a still further aspect of the present invention, a non-transitory computer-readable recording medium is encoded with a control program for causing a portable terminal to execute processing. The portable terminal includes a shooting unit and a display unit. The control program causes the portable terminal to execute the steps of: transmitting positional information and orientation information at a time of shooting by the shooting unit to an information processing apparatus; obtaining an operation history on an image processing apparatus in equipment based on information about equipment that is received from the information processing apparatus, and displaying an operation screen presenting the operation history in a selectable manner on the display unit; and accepting a selection of the operation history and then transmitting a control signal for allowing the image processing apparatus to execute image processing specified by the selected operation history, to the image processing apparatus.
Preferably, the step of the control program of displaying an operation screen on the display unit includes the steps of: requesting the operation history from the equipment based on communication information for communicating with the equipment that is included in the received information about the equipment; and receiving the operation history from the equipment.
The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram showing a specific example of a configuration of an image processing system according to an embodiment.
FIG. 2 is a diagram showing a specific example of a hardware configuration of an MFP (Multi-Functional Peripheral) included in the image processing system according to the embodiment.
FIG. 3 is a diagram showing a specific example of a hardware configuration of a PC included in the image processing system according to the embodiment.
FIG. 4 is a diagram showing a specific example of a hardware configuration of a portable terminal included in the image processing system according to the embodiment.
FIG. 5 is a diagram showing a specific example of a hardware configuration of a server included in the image processing system according to the embodiment.
FIG. 6A and FIG. 6B are diagrams showing a specific example of equipment information.
FIG. 7 is a block diagram showing a specific example of a functional configuration of the portable terminal.
FIG. 8 is a block diagram showing a specific example of a functional configuration of the server.
FIG. 9 is a sequence diagram depicting a flow of an operation for operating the MFP in an operation flow 1.
FIG. 10 is a flowchart showing a specific example of an operation in the portable terminal performing an operation for operating the MFP.
FIG. 11 is a diagram showing a specific example of an operation screen appearing on an operation panel of the portable terminal through the process in step S113 in FIG. 10.
FIG. 12 is a diagram showing another specific example of the operation screen.
FIG. 13 is a sequence diagram depicting a flow of an operation for operating the MFP in an operation flow 2.
FIG. 14 is a diagram showing a specific example of a select screen for selecting a device for which operation history is requested.
FIG. 15 is a diagram showing a specific example of a functional configuration of the portable terminal according to a modified embodiment.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
In the following, an embodiment of the present invention will be described with reference to the figures. In the following description, the same parts and components are denoted with the same reference numerals. Their names and functions are also the same.
<System Configuration>
FIG. 1 is a diagram showing a specific example of a configuration of an image processing system according to an embodiment.
Referring to FIG. 1, the image processing system according to the present embodiment includes an MFP (Multi-Functional Peripheral) 100 serving as an image processing apparatus, a personal computer (hereinafter referred to as PC) 200 serving as a control device for controlling MFP 100, a server 300, and a portable terminal 400.
The image processing apparatus is not limited to an MFP and may be a printer, a facsimile machine, a copier, or any other similar device having at least one image processing function. MFP 100 is an image processing apparatus that combines these functions.
As shown in FIG. 1, the image processing system may include a plurality of MFPs 100A, 100B, which are collectively referred to as MFP 100.
The information processing apparatus is not limited to a PC and may be any other device as long as it stores a program for controlling MFP 100, such as a printer driver, and includes a CPU (Central Processing Unit) 20 (FIG. 3) which executes the program to output an operation signal to MFP 100. Any other device may be, for example, a mobile phone or a portable document reader, and may be combined with portable terminal 400 described later.
As shown in FIG. 1, the image processing system may include a plurality of PCs 200A, 200B, which are collectively referred to as PC 200.
Portable terminal 400 is, for example, a mobile phone or any other device at least having a camera function, an instruction input function, and a communication function.
Server 300 is a general personal computer or any other similar device.
MFP 100, PC 200, and server 300 are connected to a wired or wireless network such as a LAN. Portable terminal 400 can also connect to the network, so that portable terminal 400 can communicate with each of MFP 100, PC 200, and server 300 via the network.
MFP 100, server 300, and PC 200 each have a function for performing wireless communication. Portable terminal 400 can wirelessly communicate with each of MFP 100, PC 200, and server 300. For example, communication using Bluetooth® or infrared communication can be used for wireless communication.
<Operation Overview>
In the image processing system according to the present embodiment, server 300 stores information of each of MFP 100 and PC 200 included in the image processing system, as “equipment information.”
The equipment information includes an operation history in each device. Specifically, in the case where the equipment is MFP 100, the operation history in each MFP 100 is included. In the case where the equipment is PC 200, the operation history about MFP 100 in each PC 200 is included.
The equipment information further includes positional information which is information representing the location of each device, and an IP address which is information for communication via LAN.
The operation history is transmitted to server 300 by MFP 100 and PC 200 and then registered in server 300. The operation history may be registered, by way of example, in such a manner that every time each device accepts an operation input and generates a control signal in accordance with the operation input, history information representing the control signal is generated and transmitted to server 300. In another example, each device may accumulate the history information in a predetermined memory and transmit the history information to server 300 at a preset timing, for example, at a timing as requested by server 300, at predetermined time intervals, at a timing when a predetermined amount of data is accumulated, or at a timing when the device connects to the LAN. The history information in each device may be deleted from the storage area upon being transmitted to server 300 or may be transmitted to server 300 while being left in the storage area.
The operation history at least includes information specifying MFP 100 that has executed the image processing, information (function name) specifying the function of MFP 100 that is used in the image processing, and data (document name) subjected to the image processing. The operation history may additionally include information specifying a device (target PC), which is a storage location of image data obtained as a result of the image processing, and may include information specifying the user who has performed the operation, as described later. For example, when the image processing is a process of storing image data obtained by scanning a document into a designated storage area, the storage location of image data obtained as a result of the image processing refers to the storage area or a device having the storage area. For example, when the image processing is a process of transmitting image data obtained by scanning a document to a designated device, the storage location refers to the device to which the image data is transmitted. The device serving as a storage location of image data may be a PC or MFP 100 itself.
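The pieces of information carried by one entry of the operation history described above can be sketched as a simple record. The field names below are illustrative assumptions; the description only specifies which items an entry contains, not how they are encoded.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperationHistoryEntry:
    """One operation-history entry, with hypothetical field names.

    The mandatory items are the MFP, the function used, and the data
    processed; the storage location (target PC) and the operating user
    are optional additions, as described above.
    """
    mfp_name: str                      # MFP that executed the image processing
    function_name: str                 # function of the MFP used (e.g. scan, print)
    document_name: str                 # data subjected to the image processing
    target_pc: Optional[str] = None    # storage location of the resulting image data
    user_id: Optional[str] = None      # user who performed the operation

entry = OperationHistoryEntry(
    mfp_name="MFP-A", function_name="scan",
    document_name="report.pdf", target_pc="PC-B", user_id="user01")
```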
The positional information is, for example, a combination of latitude, longitude and altitude, or the closest access point of the LAN. The positional information may be registered in server 300, for example, by an administrator when each device is installed, or may be registered in each device and transmitted to server 300 at a predetermined timing to be registered in server 300. Alternatively, when each device has a function of obtaining its own positional information, each device may obtain its own positional information at a predetermined timing using that function and may transmit the positional information to server 300 for registration in server 300.
The information for communication (hereinafter also referred to as communication information) may be, for example, an IP address. Alternatively, when Bluetooth® is used in communication as described above, the communication information may be a Bluetooth® address. When each device performs different kinds of communication, the communication information may be an address corresponding to each kind of communication. In the following description, it is assumed that each device included in the image processing system performs communication via the LAN and stores an IP address as the communication information.
The communication information may be registered in server 300, for example, by an administrator when each device is installed, or may be registered in each device and transmitted to server 300 at a predetermined timing to be registered in server 300. Alternatively, in a case where the communication information is automatically assigned to each device at a predetermined timing, for example, when connecting to the LAN, the communication information may be transmitted to server 300 by each device and registered in server 300 at the timing of being assigned or at a predetermined later timing.
The user who carries portable terminal 400 points portable terminal 400 at MFP 100 or PC 200 to shoot a photo, so that the equipment information of the shot device is transmitted from server 300 to portable terminal 400. An operation panel 45 (FIG. 4) of portable terminal 400 displays an operation screen which allows a selection of operation history included in the equipment information.
On the operation screen, the user selects the operation history representing the operation that the user wants MFP 100 to execute. Accordingly, a control signal for executing the same image processing as the image processing represented by the selected operation history is transmitted from portable terminal 400 to MFP 100, so that MFP 100 executes the image processing.
In the following, the device configurations for performing these operations will be described.
<Configuration of MFP>
FIG. 2 shows a specific example of a hardware configuration of MFP 100.
Referring to FIG. 2, MFP 100 includes a CPU (Central Processing Unit) 10 as an arithmetic unit for controlling MFP 100 as a whole, a ROM (Read Only Memory) 11 for storing a program executed in CPU 10, a RAM (Random Access Memory) 12 functioning as a work area for executing a program in CPU 10, a scanner 13 for optically scanning a document placed on a not-shown platen to obtain image data, a printer 14 for fixing image data on print paper, an operation panel 15 including a touch panel for displaying information and accepting an operation input to MFP 100, a memory 16 for storing image data, and a network controller 17 for controlling communication via the LAN.
<Configuration of PC>
FIG. 3 shows a specific example of a hardware configuration of PC 200.
Referring to FIG. 3, PC 200 includes a CPU 20 as an arithmetic unit for controlling PC 200 as a whole, a ROM 21 for storing a program executed in CPU 20, a RAM 22 functioning as a work area for executing a program in CPU 20, an operation unit 25 for performing an operation input, a memory 26 for storing a variety of information and operation history, and a network controller 27 for controlling communication via the LAN.
The hardware configuration shown in FIG. 3 is that of a general personal computer, and the hardware configuration of PC 200 is not limited to the one shown in FIG. 3. Specifically, other components may be added, or other components may be included in place of part of the configuration shown in FIG. 3.
<Configuration of Portable Terminal>
FIG. 4 shows a specific example of a hardware configuration of portable terminal 400.
Referring to FIG. 4, portable terminal 400 includes a CPU 40 as an arithmetic unit for controlling portable terminal 400 as a whole, a ROM 41 for storing a program executed in CPU 40, a RAM 42 functioning as a work area for executing a program in CPU 40, an electronic compass 43 including a magnetic sensor for detecting an orientation of portable terminal 400, a GPS (Global Positioning System) controller 44 receiving a GPS signal or a positional signal from a base station for obtaining positional information of portable terminal 400, an operation panel 45 including a touch panel for displaying information and accepting an operation input to portable terminal 400, a camera 46, and a network controller 47 for controlling communication via the LAN.
Operation panel 45 may be configured similar to operation panel 15 of MFP 100. More specifically, it includes, for example, a touch panel formed of a display such as a liquid crystal display and a position designating device such as an optical touch panel or a capacitive touch panel, and operation keys.
CPU 40 allows the touch panel to display an operation screen based on the operation history included in the equipment information transmitted from server 300 as described later. CPU 40 specifies the designated position on the touch panel, generates a control signal for allowing MFP 100 to execute image processing based on screen data of the operation screen and the specified position, and transmits the control signal to MFP 100.
Electronic compass 43 and GPS controller 44 output a signal to CPU 40 to indicate the obtained orientation or positional information of portable terminal 400.
It is noted that the hardware configuration shown in FIG. 4 is the hardware configuration necessary for portable terminal 400 to execute the operation illustrated in the operation overview above, and portable terminal 400 is not limited to one including only this hardware configuration. As other hardware, for example, a speaker, a microphone, and a communication controller for communicating with a base station may be included in a case where portable terminal 400 has a call function.
<Configuration of Server>
FIG. 5 shows a specific example of a hardware configuration of server 300.
Referring to FIG. 5, server 300 is formed, for example, of a general computer as described above. By way of example, server 300 includes a CPU 30 as an arithmetic unit for controlling server 300 as a whole, a ROM 31 for storing a program executed in CPU 30, a RAM 32 functioning as a work area for executing a program in CPU 30, an HD (Hard Disk) 33 for storing the equipment information and the like, and a network controller 34 for controlling communication via the LAN.
FIG. 6A and FIG. 6B show a specific example of the equipment information stored in HD 33. FIG. 6A illustrates the equipment information of MFP 100, and FIG. 6B illustrates the equipment information of PC 200.
Specifically, referring to FIG. 6A, for each MFP, stored as the equipment information of MFP 100 are information (equipment name) specifying the MFP, positional information, an IP address as communication information, and the operation history.
Referring to FIG. 6B, similarly, for each PC, stored as the equipment information of PC 200 are information (equipment name) for specifying the PC, positional information, an IP address as communication information, and the operation history of MFP 100.
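The equipment information of FIG. 6A and FIG. 6B can be sketched as records keyed by equipment name. The concrete keys, coordinates, and addresses below are assumptions for illustration only.

```python
# Hypothetical equipment-information records mirroring FIG. 6A and FIG. 6B.
# All names, positions, and addresses are illustrative assumptions.
equipment_info = {
    "MFP-A": {                                  # equipment information of an MFP (FIG. 6A)
        "position": (35.685, 139.753, 12.0),    # latitude, longitude, altitude
        "ip_address": "192.168.1.10",           # communication information
        "history": [
            {"mfp": "MFP-A", "function": "scan", "document": "report.pdf"},
        ],
    },
    "PC-B": {                                   # equipment information of a PC (FIG. 6B)
        "position": (35.686, 139.754, 12.0),
        "ip_address": "192.168.1.20",
        # For a PC, the stored history records operations on MFP 100
        # that were issued from that PC.
        "history": [
            {"mfp": "MFP-A", "function": "print", "document": "memo.txt"},
        ],
    },
}
```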
<Functional Configuration>
The functional configuration of each device for implementing the operation as illustrated in the operation overview in the image processing system according to the present embodiment will be described.
FIG. 7 is a block diagram showing a specific example of a functional configuration of portable terminal 400. Each function shown in FIG. 7 is a function mainly formed in CPU 40 when CPU 40 reads out a program stored in ROM 41 and executes the program on RAM 42. However, at least part of the functions may be formed by the hardware configuration shown in FIG. 4.
Referring to FIG. 7, as functions for implementing the operation as described above, portable terminal 400 includes an instruction input unit 401 for accepting an instruction input from operation panel 45, a position obtaining unit 402 for obtaining positional information of portable terminal 400 in response to a shooting instruction from operation panel 45, an orientation obtaining unit 403 for obtaining an orientation of portable terminal 400 in response to a shooting instruction from operation panel 45, an image obtaining unit 404 for obtaining image data captured by shooting by camera 46 in response to a shooting instruction from operation panel 45, a server request unit 405 for requesting the equipment information from server 300 together with the positional information and orientation information obtained in response to a shooting instruction from operation panel 45, an information obtaining unit 406 for obtaining the equipment information from server 300 in response to the request, a screen generation unit 407 for generating screen data for allowing operation panel 45 to display an operation screen based on the obtained equipment information, a command generation unit 408 for generating a control signal, and a transmission unit 409 for transmitting the generated control signal to MFP 100 serving as a control target.
Since position obtaining unit 402 and orientation obtaining unit 403 obtain positional information and orientation information, respectively, in response to a shooting instruction from operation panel 45, the information thereof can serve as information of a shooting position and information of a shooting direction, respectively. Then, in the description below, the positional information and orientation information transmitted from portable terminal 400 to server 300 in accordance with the program are also referred to as shooting position information and shooting direction information, respectively.
Screen generation unit 407 generates screen data for allowing operation panel 45 to display an operation screen which allows a selection of operation history, by referring to the operation history included in the equipment information. CPU 40 performs a display process for allowing operation panel 45 to display an operation screen based on the screen data, whereby the operation screen appears on operation panel 45.
Instruction input unit 401 also accepts an instruction input from operation panel 45, which specifies a position on operation panel 45 displaying the operation screen. A signal specifying the position represented by the instruction input is input to command generation unit 408.
Command generation unit 408 specifies the designated position on the operation screen, based on the signal and the screen data. Command generation unit 408 stores the correspondence between a position on the screen data and a position on operation panel 45 beforehand and specifies the operation history corresponding to the position based on the correspondence. Then, a control signal for allowing MFP 100 to execute image processing represented by the operation history is generated. The generated control signal is input to transmission unit 409.
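The mapping performed by command generation unit 408 can be sketched as follows, assuming a simple layout in which each operation history occupies one fixed-height row of the operation screen and assuming hypothetical control-signal fields; the actual correspondence is defined by the screen data.

```python
def select_history(touch_y, row_height, histories):
    """Map a touched vertical position on the operation screen to a
    history entry, assuming one entry per fixed-height row (an
    illustrative assumption about the screen layout)."""
    index = touch_y // row_height
    if 0 <= index < len(histories):
        return histories[index]
    return None  # touch outside the history list

def build_control_signal(history):
    """Build a hypothetical control signal asking the MFP to execute
    the image processing represented by the selected history entry."""
    return {"command": "execute",
            "function": history["function"],
            "document": history["document"]}
```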
Transmission unit 409 performs a process for transmitting the generated control signal to the IP address, by referring to the IP address included in the equipment information.
FIG. 8 is a block diagram showing a specific example of a functional configuration of server 300. Each function shown in FIG. 8 is a function mainly formed in CPU 30 when CPU 30 reads out a program stored in ROM 31 and executes the program on RAM 32. However, at least part of the functions may be formed by the hardware configuration shown in FIG. 5.
Referring to FIG. 8, as functions for implementing the operation as described above, server 300 includes an equipment information input unit 301 for accepting input of positional information, history information, and address information from each equipment, an equipment information storage unit 302 for storing or updating the input equipment information in a predetermined area in HD 33, a portable information input unit 303 for accepting input of a shooting position and a shooting direction together with a request for equipment information from portable terminal 400, a search unit 304 for searching for the equipment present in the image shot by portable terminal 400 based on the shooting position and the shooting direction of portable terminal 400, and a transmission unit 305 for transmitting the equipment information about the found equipment to portable terminal 400.
<Operation Flow 1>
As an operation flow 1, a case where MFP 100 is operated using the operation history included in the equipment information stored in server 300 will be described.
FIG. 9 is a sequence diagram illustrating a flow of an operation for operating the MFP in operation flow 1. FIG. 9 shows a flow of processing in MFP 100 on the left side, a flow of processing in portable terminal 400 at the middle, and a flow of processing in server 300 on the right side. Each operation is implemented when the CPU of each device reads out a program stored in the ROM and executes the program on the RAM.
First, as a precondition of the operation, a process of transmitting an operation history from MFP 100 to server 300 is performed, and the operation history is registered as the equipment information.
Specifically, referring to FIG. 9, upon execution of image processing in MFP 100 in accordance with a user's instruction (#01), the operation history concerning the image processing is transmitted to server 300 at a predetermined timing (#01-1). Upon receiving the operation history, server 300 executes a process for registering it in the equipment information of MFP 100 (#03).
As a result of this precondition operation, the operation history of MFP 100 is stored as the equipment information in server 300.
Next, as an operation for operating the MFP, in a state in which the application for operating the MFP is active in portable terminal 400 (step S1), the camera shoots a photo (step S3). Thereafter, upon input of an instruction for operating the MFP (step S5), information specifying a shooting position and a shooting direction at portable terminal 400 is transmitted to server 300, whereby the corresponding equipment information is requested (step S5-1).
Server 300 accepts the request from portable terminal 400, specifies the equipment located within a prescribed range in the shooting direction from the shooting position of portable terminal 400 by referring to the positional information in each piece of stored equipment information, and searches for the equipment information about the specified equipment (step S7). Then, the corresponding equipment information is transmitted to portable terminal 400 (step S7-1). Here, of the equipment information, at least the operation history and the communication information are transmitted.
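The search in step S7 can be sketched as a bearing test: equipment whose direction from the shooting position lies within an assumed camera field of view around the shooting direction is treated as being in the shot image. The equirectangular approximation and the 30-degree half angle below are assumptions for illustration.

```python
import math

def bearing_deg(origin, target):
    """Approximate compass bearing (degrees clockwise from north) from
    origin to target, each a (latitude, longitude) pair. A simple
    equirectangular approximation is assumed, adequate over office
    distances."""
    dlat = target[0] - origin[0]
    dlon = (target[1] - origin[1]) * math.cos(math.radians(origin[0]))
    return math.degrees(math.atan2(dlon, dlat)) % 360.0

def find_equipment(shoot_pos, shoot_dir_deg, equipment, half_angle_deg=30.0):
    """Return names of equipment within the assumed field of view
    centered on the shooting direction."""
    found = []
    for name, info in equipment.items():
        diff = abs(bearing_deg(shoot_pos, info["position"]) - shoot_dir_deg)
        diff = min(diff, 360.0 - diff)  # wrap around north
        if diff <= half_angle_deg:
            found.append(name)
    return found
```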
At portable terminal 400, the application causes an operation screen to appear to present the operation history included in the equipment information in a selectable manner (step S9).
When an operation history is selected (touched) on the operation screen appearing on operation panel 45 of portable terminal 400 (step S11), a control signal is generated for allowing MFP 100 to execute image processing indicated by the selected operation history (step S13). Then, the control signal is transmitted to MFP 100 (step S13-1).
Upon receiving the control signal, MFP 100 executes the designated image processing in accordance with the control signal (step S15).
It is noted that in a case where the image processing system requires a user authentication process as a precondition, the operation history may be associated with information (for example, a user ID) specifying the user who performs the operation, as described above. Server 300 may store the correspondence between information specifying portable terminal 400 and a user ID beforehand. In this case, in response to a request from portable terminal 400, in step S7, server 300 may specify the user ID related to portable terminal 400 and search for the equipment information which is about the equipment located within a predetermined range in the shooting direction from the shooting position of portable terminal 400 and which includes the operation history associated with the user ID. Then, the found equipment information may be transmitted in step S7-1.
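The user-based narrowing described above can be sketched as a filter over the stored equipment information. The dictionary shape and the `user_id` key follow the illustrative record layout and are assumptions; resolving the terminal to a user ID is taken as already done.

```python
def filter_history_by_user(equipment_info, user_id):
    """Keep only equipment whose history contains entries associated
    with user_id, and return only those entries (a sketch of the
    filtering described above; field names are assumptions)."""
    result = {}
    for name, info in equipment_info.items():
        entries = [h for h in info["history"] if h.get("user_id") == user_id]
        if entries:
            result[name] = {**info, "history": entries}
    return result
```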
In this manner, in step S9, an operation screen appears which allows the user related to portable terminal 400 to select an operation history. Therefore, when the user repeats an operation performed before, an easier-to-operate operation screen is displayed.
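The user-specific filtering described above could be sketched as follows, again with hypothetical names: the server is assumed to hold a terminal-to-user mapping registered beforehand, and each operation-history entry is assumed to be tagged with the user ID of the user who performed it.

```python
# Hypothetical mapping stored beforehand in server 300: terminal -> user ID.
TERMINAL_TO_USER = {"terminal400": "userA"}

# Hypothetical operation-history entries, each associated with a user ID.
HISTORY = [
    {"user": "userA", "function": "scan", "document": "report.pdf"},
    {"user": "userB", "function": "copy", "document": "memo.txt"},
]

def history_for_terminal(terminal_id, history=HISTORY):
    """Step S7 sketch with user authentication: return only the entries
    associated with the user ID registered for the requesting terminal."""
    user = TERMINAL_TO_USER.get(terminal_id)
    return [entry for entry in history if entry["user"] == user]
```

Only the filtered entries would then be included in the equipment information transmitted in step S7-1, so the operation screen of step S9 presents just the requesting user's own history.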
In the example in FIG. 9, the equipment information of MFP 100 is transmitted as equipment information from server 300. This is an example of the operation in the case where MFP 100 is shot by portable terminal 400. Similarly, when PC 200 is shot by portable terminal 400, the operation history on MFP 100 in PC 200 is received from server 300. In this case, as a precondition, when PC 200 generates and transmits a command for executing image processing to MFP 100 according to a user's instruction, PC 200 transmits the operation history concerning the command to server 300 at a predetermined timing. Accordingly, the operation history on MFP 100 is registered as the equipment information of PC 200 in server 300.
The operation in portable terminal 400 will be described in detail below using the flowchart.
FIG. 10 is a flowchart illustrating a specific example of an operation in portable terminal 400 performing an operation for operating the MFP. The operation shown in the flowchart in FIG. 10 is implemented when CPU 40 reads out a program stored in ROM 41 corresponding to the application for operating the MFP and executes the read program on RAM 42.
Referring to FIG. 10, in a state in which CPU 40 is executing the application for operating the MFP (YES in step S101), and if an instruction for operating the MFP is input from operation panel 45 (YES in step S103), then, in step S105, CPU 40 transmits information representing a shooting position and a shooting direction to server 300 and requests transmission of the equipment information of the corresponding equipment.
When a response is received from server 300 (YES in step S107), and when the response is the equipment information (NO in step S109) and the equipment information includes the operation history of MFP (YES in step S111), CPU 40 executes a process for displaying an operation screen presenting the operation history in a selectable manner on operation panel 45, in step S113.
FIG. 11 is a diagram showing a specific example of the operation screen appearing on operation panel 45 through the process in step S113 as described above.
FIG. 11 shows a specific example of the operation screen appearing when the equipment information of MFP 100 is received as equipment information from server 300. In this case, referring to FIG. 11, the operation history displayed as a choice in the operation screen includes, for each image processing performed in MFP 100, information (function name) specifying the function of MFP 100 that is used in the image processing, data (document name) subjected to the image processing, and information specifying a device (target PC) which is a storage location of image data obtained as a result of the image processing. The operation history shown as a choice in the operation screen is not necessarily displayed with all of the above-noted information and may be displayed with at least one of them. Alternatively, only the information specifying the operation history itself may be displayed. In this case, when the operation history based on such information is selected, the next screen or a pop-up screen may appear to display the detailed contents of the selected operation history.
Preferably, when displaying the operation screen in step S113, CPU 40 sorts the operation history so as to be displayed for each storage location of image data, and generates screen data for the operation screen. Alternatively, preferably, when displaying the operation screen in step S113, CPU 40 sorts the operation history so as to be displayed for each function of the MFP that is necessary for the image processing designated by the operation history, and generates screen data for the operation screen. FIG. 11 illustrates the operation screen based on the screen data generated by sorting the operation history in any of the foregoing manners. In this way, as shown in FIG. 11, the operation history is displayed for each storage location of image data or for each function of the MFP, thereby allowing the user to easily find the desired operation history.
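The sorting described above can be illustrated with a small grouping helper. This is only a sketch with hypothetical field names; the same helper covers grouping by storage location (target PC), by MFP function, or, as in the PC 200 case below, by MFP.

```python
from collections import defaultdict

# Hypothetical operation-history entries with the fields named in the text.
HISTORY = [
    {"function": "scan", "document": "a.pdf", "target_pc": "PC200"},
    {"function": "copy", "document": "b.pdf", "target_pc": "PC201"},
    {"function": "scan", "document": "c.pdf", "target_pc": "PC200"},
]

def group_history(history, key):
    """Group history entries by the given field (e.g. 'target_pc' or
    'function') so the operation screen can list them per group."""
    groups = defaultdict(list)
    for entry in history:
        groups[entry[key]].append(entry)
    return dict(groups)
```

Screen data generated from such groups would list the entries under one heading per storage location or per function, which is what lets the user find the desired operation history quickly.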
Similarly, when the equipment information of PC 200 is received as equipment information from server 300, preferably, when displaying the operation screen in step S113, CPU 40 sorts the operation history so as to be displayed for each MFP that has executed image processing designated by the operation history, and generates screen data for the operation screen. In this manner, the operation history is displayed for each MFP, thereby allowing the user to easily find the desired operation history.
Upon accepting an operation input on the operation screen (YES in step S115), in step S117, CPU 40 generates a control signal for allowing MFP 100 to execute the image processing indicated by the operation history designated by the operation input.
It is noted that, in step S115, an operation to change the operation history may be accepted in place of an instruction to select. This is applicable to the example described below.
As an example of this case, when generating screen data for displaying the operation screen as shown in FIG. 11 as the operation screen, CPU 40 generates screen data for displaying a function name, a document name, or a storage location of image data in a changeable manner in each operation history. For example, as shown in FIG. 12, a pull-down button is displayed next to each of a function name, a document name, and a target PC. Pressing the button causes another function, another document, or another device to show up as a choice. Other functions of MFP 100, other document names, and other devices may be stored beforehand in portable terminal 400, or may be obtained with reference to the function included in any other operation history, or may be included in the equipment information about MFP 100 that is transmitted from server 300.
In this case, in step S117, CPU 40 generates a control signal for allowing MFP 100 to execute image processing indicated by the changed operation history, based on the changed operation history.
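Building the control signal from a (possibly changed) history entry could look like the following sketch. The field names and the shape of the control signal are hypothetical; the text only requires that the signal designate the image processing indicated by the selected or changed entry.

```python
def build_control_signal(entry, **changes):
    """Step S117 sketch: apply any user changes (e.g. another document or
    target PC chosen from a pull-down as in FIG. 12) to a history entry,
    then build the control signal sent to MFP 100."""
    changed = {**entry, **changes}  # changes override the original fields
    return {
        "command": changed["function"],
        "document": changed["document"],
        "target": changed["target_pc"],
    }
```

Selecting an entry unchanged reproduces the previous operation; supplying a change such as a different document name yields a control signal for the modified operation.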
<Operation Flow 2>
As an operation flow 2, a case where the operation history is stored in each device and the equipment information including information other than the operation history is stored in server 300 will be described. In other words, in this case, server 300 stores the positional information and communication information as the equipment information of each of MFP 100 and PC 200, and does not have to store the operation history.
FIG. 13 is a sequence diagram illustrating a flow of an operation for operating the MFP in operation flow 2. FIG. 13 shows a flow of processing in MFP 100 on the left, a flow of processing in portable terminal 400, second from left, a flow of processing in server 300, third from left, and a flow of processing in PC 200 on the right. Each operation is implemented when the CPU of each device reads out a program stored in the ROM and executes the program on the RAM.
Referring to FIG. 13, similarly to the flow up to step S5 in the operation illustrated in FIG. 9, the information specifying the shooting position and shooting direction at portable terminal 400 is transmitted to server 300 in step S5-1. Upon request for the corresponding equipment information, the equipment information about the corresponding device is searched for in server 300 in step S7, similarly as in operation flow 1 illustrated in FIG. 9. The found equipment information is transmitted from server 300 to portable terminal 400 in step S7-1′. In operation flow 2, the equipment information does not have to include the operation history as a precondition, and of the equipment information, at least the communication information is transmitted in step S7-1′.
In operation flow 2, portable terminal 400 receiving the equipment information specifies the equipment from which the operation history is requested, based on the communication information included in the equipment information (step S8). The operation history is requested from the specified equipment (step S8-1). Then, in response to the request, the operation history of MFP 100 is transmitted to portable terminal 400 (step S8-2).
Similarly as in the description of operation flow 1, in the case where the operation history is associated with the user ID of the user who has performed the operation, each device stores the correspondence between the information specifying portable terminal 400 and the user ID beforehand, so that each device can transmit the operation history associated with the user ID corresponding to portable terminal 400 that has requested the operation history, among the stored operation history, to portable terminal 400, in step S8-2.
FIG. 13 shows an example in which the equipment information about PC 200 is transmitted from server 300 to portable terminal 400, and the operation history of MFP 100 is requested by portable terminal 400 from PC 200 based on the equipment information. FIG. 13 is an example of operation flow 2, which is applicable to a case where the equipment information about MFP 100 is transmitted from server 300 to portable terminal 400. More specifically, also in this case, the equipment information does not include the operation history as a precondition, and portable terminal 400 receiving the equipment information requests the operation history from MFP 100 based on the communication information included in the equipment information and obtains the operation history from MFP 100 responding to the request.
The following operation after step S9 is similar to operation flow 1 shown in FIG. 9.
<Effects of Embodiment>
Through the operation as described above in the image processing system according to the embodiment, MFP 100 can be operated using portable terminal 400 as described in the operation overview.
As recent MFPs have become more sophisticated, ever more functions are installed and available in them. As a result, the operation screens appearing on operation panel 15 of MFP 100 display many options for selecting a function to be operated from among those functions, as well as options for operation within those functions. Therefore, such operation screens are difficult to operate for users unfamiliar with them.
In such a case, in the image processing system according to the embodiment, the user can activate the dedicated application on the familiar portable terminal 400 and point portable terminal 400 to shoot a device, so that the operation history on MFP 100 in the device is displayed in a selectable manner. The user then selects the operation history matching the desired operation, thereby allowing MFP 100 to execute the image processing designated by that operation.
Accordingly, the user can easily perform an operation for executing image processing indicated by the operation history. In addition, even when MFP 100 is located at a distance from the user, the user does not have to move there and can operate MFP 100 with portable terminal 400 the user carries.
<Modified Embodiment>
In the foregoing description, as a precondition of the operation for operating MFP 100 using portable terminal 400, one of the devices included in the image processing system is shot by camera 46, and the shooting position and shooting direction then are transmitted to server 300. Server 300 stores the positional information of each device beforehand and specifies the device located within a shooting range obtained from the shooting position and shooting direction from portable terminal 400.
However, it is not always necessary to shoot a photo with portable terminal 400 at the start of the operation as a precondition for operating MFP 100; an image shot and stored beforehand may be used instead.
In this case, image data obtained by shooting a device included in the image processing system with camera 46 is stored beforehand in portable terminal 400, in association with the positional information and orientation information at the time of shooting. In other words, a shot image is stored in portable terminal 400 in association with the shooting position and shooting direction.
At portable terminal 400, the application for operating the MFP is activated to display a screen for selecting a device to be operated. On this screen, images shot before are displayed as choices from which a target device is selected. FIG. 14 is a diagram showing a specific example of a select screen for selecting a device for which operation history is requested. Referring to FIG. 14, it is assumed, by way of example, that image data of each device shot before is stored in portable terminal 400, and the application is activated to display the shot images in a selectable manner. In place of the shot images, an icon for selecting each device may be displayed, or an entry field for designating one of the devices in text may be displayed. It is noted that in these cases, the shooting position and shooting direction of the corresponding device are associated and stored beforehand.
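The association between stored shots and shooting parameters could be sketched as follows. The store and its keys are hypothetical; the point is only that selecting a previously shot image (or icon, or text entry) recovers the shooting position and shooting direction recorded at shooting time, which are then sent to server 300 exactly as in the main embodiment.

```python
# Hypothetical store on portable terminal 400: each previously shot image is
# saved in association with the position and direction at the time of shooting.
STORED_SHOTS = {
    "mfp100.jpg": {"position": (0.0, 0.0), "direction": 0.0},
    "pc200.jpg":  {"position": (5.0, 5.0), "direction": 90.0},
}

def request_params_for(selected_image):
    """When a stored image is selected on the select screen of FIG. 14,
    reuse the recorded shooting parameters instead of shooting again."""
    shot = STORED_SHOTS[selected_image]
    return shot["position"], shot["direction"]
```

The equipment specifying unit described below would output these recovered parameters to the server request unit in place of live camera data.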
FIG. 15 is a diagram showing a specific example of a functional configuration of portable terminal 400 according to a modified embodiment. Each function shown in FIG. 15 is also a function mainly formed in CPU 40 when CPU 40 reads out a program stored in ROM 41 and executes the program on RAM 42. However, at least part of the functions may be formed by the hardware configuration shown in FIG. 4.
Referring to FIG. 15, portable terminal 400 according to the modified embodiment includes an equipment specifying unit 410 in place of position obtaining unit 402, orientation obtaining unit 403, and image obtaining unit 404 shown in FIG. 7.
Equipment specifying unit 410 accepts input of a signal indicating an operation position on the select screen as shown in FIG. 14 at instruction input unit 401 from operation panel 45, specifies the selected equipment based on the signal, and outputs the shooting position and shooting direction stored in association with the equipment to server request unit 405. Server request unit 405 of portable terminal 400 according to the modified embodiment requests the equipment information from server 300 along with the input shooting position and shooting direction.
Through the operation shown in the modified embodiment, when image data of the equipment shot before is stored, MFP 100 can be operated using the shot image without shooting a photo again. Even when portable terminal 400 does not have a shooting function, if the information specifying each device is stored beforehand, MFP 100 can be operated using the stored information.
The present invention also provides a program for allowing each device included in the image processing system to execute the foregoing operation. Such a program may be stored in a computer-readable recording medium accompanying a computer, such as a flexible disk, a CD-ROM (Compact Disk-Read Only Memory), a ROM, a RAM, and a memory card, and be provided as a program product. Alternatively, the program may be stored in a recording medium such as a hard disk contained in a computer. The program may be downloaded via a network.
The program in accordance with the present invention may allow the process to be executed by invoking necessary modules, among program modules provided as a part of the operating system (OS) of a computer, in a prescribed sequence at a prescribed timing. In this case, the modules are not included in the program itself, and the process is executed in cooperation with the OS. The program that does not include such modules may also be included in the program in accordance with the present invention.
Furthermore, the program in accordance with the present invention may be embedded in a part of another program. In this case, the modules included in another program are not included in the program itself, and the process is executed in cooperation with another program. Such a program embedded in another program may also be included in the program in accordance with the present invention.
The provided program product is installed in a program storage unit such as a hard disk for execution. It is noted that the program product includes the program itself and a recording medium encoded with the program.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the scope of the present invention being interpreted by the terms of the appended claims.

Claims (15)

What is claimed is:
1. An image processing system comprising:
a portable terminal;
equipment; and
an information processing apparatus,
wherein at least one of said equipment is an image processing apparatus including a controller,
said portable terminal including
a shooting unit,
an obtaining unit for obtaining positional information and orientation information of said portable terminal,
a display unit, and
an input unit for inputting an instruction on an operation screen displayed on said display unit,
said information processing apparatus including
a storage unit for storing, as information about said equipment, positional information of said equipment and communication information for communicating with said equipment, wherein
said portable terminal transmits positional information and orientation information at a time of shooting by said shooting unit to said information processing apparatus,
said information processing apparatus detects said equipment included in an image shot by said shooting unit of said portable terminal, based on the positional information and orientation information at a time of shooting at said portable terminal, and transmits information about said detected equipment to said portable terminal, and
said portable terminal further includes a controller for executing a process of obtaining an operation history on said image processing apparatus in said equipment, based on said received information about said equipment, allowing said display unit to display an operation screen presenting said operation history in a selectable manner, and when accepting a selection of said operation history at said input unit, transmitting a control signal for allowing said image processing apparatus to execute image processing specified by said selected operation history, to said image processing apparatus.
2. The image processing system according to claim 1, wherein
said information processing apparatus further stores an operation history on said image processing apparatus in said equipment as said information about said equipment, and
said controller of said portable terminal allows said display unit to display said operation screen using said operation history included in said received information about said equipment.
3. The image processing system according to claim 1, wherein said controller of said portable terminal requests an operation history on said image processing apparatus from said equipment, based on said communication information for communicating with said equipment that is included in said received information about said equipment, and obtains said operation history received from said equipment.
4. The image processing system according to claim 1, wherein
said portable terminal further includes a storage unit for storing positional information and orientation information at a time when said equipment is shot, in association with information specifying said equipment, and
when accepting a selection of said equipment, said portable terminal transmits positional information and orientation information at a time when said selected equipment is shot, to said information processing apparatus.
5. The image processing system according to claim 1, wherein said equipment includes a controller for controlling said image processing apparatus.
6. The image processing system according to claim 1, wherein said operation history includes information specifying an image processing apparatus that has executed image processing designated by the operation, a function of said image processing apparatus that is necessary for said image processing, and image data subjected to said image processing.
7. The image processing system according to claim 6, wherein when allowing said display unit to display said operation screen, said controller of said portable terminal sorts said operation history so as to be displayed for each image processing apparatus that has executed image processing designated by said operation history.
8. The image processing system according to claim 6, wherein when allowing said display unit to display said operation screen, said controller of said portable terminal sorts said operation history so as to be displayed for each function of said image processing apparatus that is necessary for image processing designated by said operation history.
9. The image processing system according to claim 1, wherein
said operation history includes information specifying an image processing apparatus that has executed image processing designated by the operation, a function of said image processing apparatus that is necessary for said image processing, image data subjected to said image processing, and a storage location of image data obtained by said image processing, and
when allowing said display unit to display said operation screen, said controller of said portable terminal sorts said operation history so as to be displayed for each storage location of image data obtained by image processing designated by said operation history.
10. The image processing system according to claim 1, wherein said controller of said portable terminal allows said display unit to display said operation history in a selectable and changeable manner on said operation screen, and when accepting an instruction to change said operation history on said operation screen, transmits a control signal for allowing said image processing apparatus to execute image processing specified by said changed operation history, to said image processing apparatus.
11. A control method for an image processing system including a portable terminal having a shooting unit and a display unit, equipment, at least one of which is an image processing apparatus, and an information processing apparatus,
wherein said information processing apparatus stores, as information about said equipment, positional information of said equipment and communication information for communicating with said equipment,
said control method comprising the steps of:
causing said portable terminal to transmit positional information and orientation information at a time of shooting by said shooting unit of said portable terminal to said information processing apparatus;
causing said information processing apparatus to detect equipment included in an image shot by said shooting unit of said portable terminal, based on said positional information and orientation information transmitted from said portable terminal and the positional information included in said information about said equipment, and to transmit said information about said detected equipment to said portable terminal;
causing said portable terminal to obtain an operation history on said image processing apparatus in said equipment, based on said information about said equipment transmitted from said information processing apparatus, and to allow said display unit to display an operation screen presenting said operation history in a selectable manner;
when accepting a selection of said operation history, causing said portable terminal to transmit a control signal for allowing said image processing apparatus to execute image processing specified by said selected operation history, to said image processing apparatus; and
executing corresponding image processing based on said signal in said image processing apparatus.
12. A portable terminal comprising:
a shooting unit;
an obtaining unit for obtaining positional information and orientation information of said portable terminal;
a display unit;
an input unit for inputting an instruction on an operation screen displayed on said display unit; and
a controller,
wherein said controller executes
a process of transmitting positional information and orientation information at a time of shooting by said shooting unit to an information processing apparatus,
a process of obtaining an operation history on an image processing apparatus in equipment based on information about equipment that is received from said information processing apparatus, and allowing said display unit to display an operation screen presenting said operation history in a selectable manner, and
a process of accepting a selection of said operation history at said input unit and then transmitting a control signal for allowing said image processing apparatus to execute image processing specified by said selected operation history, to said image processing apparatus.
13. The portable terminal according to claim 12, wherein said controller further executes
a process of requesting said operation history from said equipment based on communication information for communicating with said equipment that is included in said received information about said equipment, in the process of allowing said display unit to display an operation screen, and
a process of receiving said operation history from said equipment.
14. A non-transitory computer-readable recording medium encoded with a control program for causing a portable terminal to execute processing,
said portable terminal including a shooting unit and a display unit,
said control program causing said portable terminal to execute the steps of:
transmitting positional information and orientation information at a time of shooting by said shooting unit to an information processing apparatus;
obtaining an operation history on an image processing apparatus in equipment based on information about equipment that is received from said information processing apparatus, and displaying an operation screen presenting said operation history in a selectable manner on said display unit; and
accepting a selection of said operation history and then transmitting a control signal for allowing said image processing apparatus to execute image processing specified by said selected operation history, to said image processing apparatus.
15. The non-transitory computer-readable recording medium according to claim 14, wherein said step of said control program of displaying an operation screen on said display unit includes the steps of:
requesting said operation history from said equipment based on communication information for communicating with said equipment that is included in said received information about said equipment; and
receiving said operation history from said equipment.
US13/296,966 2010-11-15 2011-11-15 Image processing system with ease of operation Active 2032-01-16 US8471914B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-254514 2010-11-15
JP2010254514A JP5212448B2 (en) 2010-11-15 2010-11-15 Image processing system, control method for image processing apparatus, portable terminal, and control program

Publications (2)

Publication Number Publication Date
US20120120259A1 US20120120259A1 (en) 2012-05-17
US8471914B2 true US8471914B2 (en) 2013-06-25

Family

ID=46047419


Country Status (3)

Country Link
US (1) US8471914B2 (en)
JP (1) JP5212448B2 (en)
CN (1) CN102469232B (en)

US10834525B2 (en) 2016-02-26 2020-11-10 Snap Inc. Generation, curation, and presentation of media collections
US10839219B1 (en) 2016-06-20 2020-11-17 Pipbin, Inc. System for curation, distribution and display of location-dependent augmented reality content
US10862951B1 (en) 2007-01-05 2020-12-08 Snap Inc. Real-time display of multiple images
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US10915911B2 (en) 2017-02-03 2021-02-09 Snap Inc. System to determine a price-schedule to distribute media content
US10933311B2 (en) 2018-03-14 2021-03-02 Snap Inc. Generating collectible items based on location information
US10948717B1 (en) 2015-03-23 2021-03-16 Snap Inc. Reducing boot time and power consumption in wearable display systems
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US10993069B2 (en) 2015-07-16 2021-04-27 Snap Inc. Dynamically adaptive media content delivery
US10997783B2 (en) 2015-11-30 2021-05-04 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US11023514B2 (en) 2016-02-26 2021-06-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US11030787B2 (en) 2017-10-30 2021-06-08 Snap Inc. Mobile-based cartographic control of display content
US11037372B2 (en) 2017-03-06 2021-06-15 Snap Inc. Virtual vision system
US11044393B1 (en) 2016-06-20 2021-06-22 Pipbin, Inc. System for curation and display of location-dependent augmented reality content in an augmented estate system
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11163941B1 (en) 2018-03-30 2021-11-02 Snap Inc. Annotating a collection of media content items
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US11182383B1 (en) 2012-02-24 2021-11-23 Placed, Llc System and method for data collection to validate location data
US11189299B1 (en) 2017-02-20 2021-11-30 Snap Inc. Augmented reality speech balloon system
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11201981B1 (en) 2016-06-20 2021-12-14 Pipbin, Inc. System for notification of user accessibility of curated location-dependent content in an augmented estate
US11206615B2 (en) 2019-05-30 2021-12-21 Snap Inc. Wearable device location systems
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11216869B2 (en) 2014-09-23 2022-01-04 Snap Inc. User interface to augment an image using geolocation
US11228551B1 (en) 2020-02-12 2022-01-18 Snap Inc. Multiple gateway message exchange
US11232040B1 (en) 2017-04-28 2022-01-25 Snap Inc. Precaching unlockable data elements
US11249614B2 (en) 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US11250075B1 (en) 2017-02-17 2022-02-15 Snap Inc. Searching social media content
US11265273B1 (en) 2017-12-01 2022-03-01 Snap, Inc. Dynamic media overlay with smart widget
US11290851B2 (en) 2020-06-15 2022-03-29 Snap Inc. Location sharing using offline and online objects
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11301117B2 (en) 2019-03-08 2022-04-12 Snap Inc. Contextual information in chat
US11314776B2 (en) 2020-06-15 2022-04-26 Snap Inc. Location sharing using friend list versions
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
US11361493B2 (en) 2019-04-01 2022-06-14 Snap Inc. Semantic texture mapping system
US11388226B1 (en) 2015-01-13 2022-07-12 Snap Inc. Guided personal identity based actions
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US11430091B2 (en) 2020-03-27 2022-08-30 Snap Inc. Location mapping for large scale augmented-reality
US11455082B2 (en) 2018-09-28 2022-09-27 Snap Inc. Collaborative achievement interface
US11475254B1 (en) 2017-09-08 2022-10-18 Snap Inc. Multimodal entity identification
US11483267B2 (en) 2020-06-15 2022-10-25 Snap Inc. Location sharing using different rate-limited links
US11503432B2 (en) 2020-06-15 2022-11-15 Snap Inc. Scalable real-time location sharing framework
US11500525B2 (en) 2019-02-25 2022-11-15 Snap Inc. Custom media overlay system
US11507614B1 (en) 2018-02-13 2022-11-22 Snap Inc. Icon based tagging
US11516167B2 (en) 2020-03-05 2022-11-29 Snap Inc. Storing data based on device location
US11558709B2 (en) 2018-11-30 2023-01-17 Snap Inc. Position service to determine relative position to map features
US11574431B2 (en) 2019-02-26 2023-02-07 Snap Inc. Avatar based on weather
US11601783B2 (en) 2019-06-07 2023-03-07 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11601888B2 (en) 2021-03-29 2023-03-07 Snap Inc. Determining location using multi-source geolocation data
US11606755B2 (en) 2019-05-30 2023-03-14 Snap Inc. Wearable device location systems architecture
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11625443B2 (en) 2014-06-05 2023-04-11 Snap Inc. Web document enhancement
US11631276B2 (en) 2016-03-31 2023-04-18 Snap Inc. Automated avatar generation
US11645324B2 (en) 2021-03-31 2023-05-09 Snap Inc. Location-based timeline media content system
US11676378B2 (en) 2020-06-29 2023-06-13 Snap Inc. Providing travel-based augmented reality content with a captured image
US11675831B2 (en) 2017-05-31 2023-06-13 Snap Inc. Geolocation based playlists
US11714535B2 (en) 2019-07-11 2023-08-01 Snap Inc. Edge gesture interface with smart interactions
US11729343B2 (en) 2019-12-30 2023-08-15 Snap Inc. Including video feed in message thread
US11734712B2 (en) 2012-02-24 2023-08-22 Foursquare Labs, Inc. Attributing in-store visits to media consumption based on data collected from user devices
US11751015B2 (en) 2019-01-16 2023-09-05 Snap Inc. Location-based context information sharing in a messaging system
US11776256B2 (en) 2020-03-27 2023-10-03 Snap Inc. Shared augmented reality system
US11785161B1 (en) 2016-06-20 2023-10-10 Pipbin, Inc. System for user accessibility of tagged curated augmented reality content
US11799811B2 (en) 2018-10-31 2023-10-24 Snap Inc. Messaging and gaming applications communication platform
US11809624B2 (en) 2019-02-13 2023-11-07 Snap Inc. Sleep detection in a location sharing system
US11816853B2 (en) 2016-08-30 2023-11-14 Snap Inc. Systems and methods for simultaneous localization and mapping
US11821742B2 (en) 2019-09-26 2023-11-21 Snap Inc. Travel based notifications
US11829834B2 (en) 2021-10-29 2023-11-28 Snap Inc. Extended QR code
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11860888B2 (en) 2018-05-22 2024-01-02 Snap Inc. Event detection system
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11870743B1 (en) 2017-01-23 2024-01-09 Snap Inc. Customized digital avatar accessories
US11877211B2 (en) 2019-01-14 2024-01-16 Snap Inc. Destination sharing in location sharing system
US11876941B1 (en) 2016-06-20 2024-01-16 Pipbin, Inc. Clickable augmented reality content manager, system, and network
US11893208B2 (en) 2019-12-31 2024-02-06 Snap Inc. Combined map icon with action indicator
US11900418B2 (en) 2016-04-04 2024-02-13 Snap Inc. Mutable geo-fencing system
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars
US11943192B2 (en) 2020-08-31 2024-03-26 Snap Inc. Co-location connection service
US11972529B2 (en) 2019-02-01 2024-04-30 Snap Inc. Augmented reality system

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5776725B2 (en) * 2013-05-14 2015-09-09 コニカミノルタ株式会社 Image processing cooperation system, portable terminal device, image processing cooperation method, and image processing cooperation program
JP5903417B2 (en) * 2013-09-13 2016-04-13 京セラドキュメントソリューションズ株式会社 Electronic device and device setting program
US10165130B2 (en) * 2014-02-13 2018-12-25 Emerge Print Management, Llc System and method for the passive monitoring and reporting of printer-related data on USB cables
JP6387334B2 (en) * 2015-09-24 2018-09-05 東芝テック株式会社 Mobile terminal and program
JP2018005545A (en) * 2016-07-01 2018-01-11 富士ゼロックス株式会社 Information processing device and program
JP6544350B2 (en) 2016-12-28 2019-07-17 京セラドキュメントソリューションズ株式会社 Image formation system
JP6733717B2 (en) * 2018-10-04 2020-08-05 カシオ計算機株式会社 Communication device and program
WO2022070233A1 (en) * 2020-09-29 2022-04-07 日本電気株式会社 Communication control apparatus, operation terminal, device operation system, communication control method, operation terminal control method, and program

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0970021A (en) 1995-09-01 1997-03-11 Noosaido:Kk Video transmitter-receiver
JP2003022229A (en) 2001-07-06 2003-01-24 Minolta Co Ltd Information processing system and information processing terminal
JP2005026740A (en) 2003-06-30 2005-01-27 Matsushita Electric Ind Co Ltd Method of building device control interface
JP2006091390A (en) 2004-09-24 2006-04-06 Mitsubishi Electric Corp Information display system and method, program and information display terminal device for making computer perform information display method
JP2006351024A (en) 2002-05-24 2006-12-28 Olympus Corp Information presentation system of visual field agreement type, and portable information terminal for use in the same
JP2007111921A (en) 2005-10-18 2007-05-10 Konica Minolta Business Technologies Inc Image forming apparatus and program
US20080171573A1 (en) * 2007-01-11 2008-07-17 Samsung Electronics Co., Ltd. Personalized service method using user history in mobile terminal and system using the method
US7516421B2 (en) 2002-05-24 2009-04-07 Olympus Corporation Information presentation system of visual field agreement type, and portable information terminal and server for use in the system
JP2009098903A (en) 2007-10-16 2009-05-07 Fuji Xerox Co Ltd Information equipment system
JP2010088032A (en) 2008-10-02 2010-04-15 Nec Corp Remote control system
US20100182435A1 (en) * 2009-01-21 2010-07-22 Yoshihiro Machida Video information control apparatus and method
US20110244919A1 (en) * 2010-03-19 2011-10-06 Aller Joshua V Methods and Systems for Determining Image Processing Operations Relevant to Particular Imagery
US8095174B2 (en) * 2007-02-06 2012-01-10 Nec Corporation Cellular phone, method for customizing cellular phone and program for customizing cellular phone

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4553364B2 (en) * 2005-02-18 2010-09-29 キヤノン株式会社 Printing system
KR101147748B1 (en) * 2005-05-26 2012-05-25 엘지전자 주식회사 A mobile telecommunication device having a geographic information providing function and the method thereof
CN101228502A (en) * 2005-06-29 2008-07-23 诺基亚公司 More intelligent printing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Notice of Grounds of Rejection mailed Aug. 28, 2012, directed to Japanese Application No. 2010-254514; 6 pages.

Cited By (333)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10862951B1 (en) 2007-01-05 2020-12-08 Snap Inc. Real-time display of multiple images
US11588770B2 (en) 2007-01-05 2023-02-21 Snap Inc. Real-time display of multiple images
US10334307B2 (en) 2011-07-12 2019-06-25 Snap Inc. Methods and systems of providing visual content editing functions
US11451856B2 (en) 2011-07-12 2022-09-20 Snap Inc. Providing visual content editing functions
US10999623B2 (en) 2011-07-12 2021-05-04 Snap Inc. Providing visual content editing functions
US11750875B2 (en) 2011-07-12 2023-09-05 Snap Inc. Providing visual content editing functions
US11182383B1 (en) 2012-02-24 2021-11-23 Placed, Llc System and method for data collection to validate location data
US11734712B2 (en) 2012-02-24 2023-08-22 Foursquare Labs, Inc. Attributing in-store visits to media consumption based on data collected from user devices
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars
US10169924B2 (en) 2012-08-22 2019-01-01 Snaps Media Inc. Augmented reality virtual content platform apparatuses, methods and systems
US9721394B2 (en) 2012-08-22 2017-08-01 Snaps Media, Inc. Augmented reality virtual content platform apparatuses, methods and systems
US9792733B2 (en) 2012-08-22 2017-10-17 Snaps Media, Inc. Augmented reality virtual content platform apparatuses, methods and systems
US10887308B1 (en) 2012-11-08 2021-01-05 Snap Inc. Interactive user-interface to adjust access privileges
US11252158B2 (en) 2012-11-08 2022-02-15 Snap Inc. Interactive user-interface to adjust access privileges
US9882907B1 (en) 2012-11-08 2018-01-30 Snap Inc. Apparatus and method for single action control of social network profile access
US11134046B2 (en) 2013-05-30 2021-09-28 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US10587552B1 (en) 2013-05-30 2020-03-10 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US11509618B2 (en) 2013-05-30 2022-11-22 Snap Inc. Maintaining a message thread with opt-in permanence for entries
US9705831B2 (en) 2013-05-30 2017-07-11 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US11115361B2 (en) 2013-05-30 2021-09-07 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US10439972B1 (en) 2013-05-30 2019-10-08 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9742713B2 (en) 2013-05-30 2017-08-22 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9794303B1 (en) 2013-11-26 2017-10-17 Snap Inc. Method and system for integrating real time communication features in applications
US10681092B1 (en) 2013-11-26 2020-06-09 Snap Inc. Method and system for integrating real time communication features in applications
US11546388B2 (en) 2013-11-26 2023-01-03 Snap Inc. Method and system for integrating real time communication features in applications
US9083770B1 (en) 2013-11-26 2015-07-14 Snapchat, Inc. Method and system for integrating real time communication features in applications
US11102253B2 (en) 2013-11-26 2021-08-24 Snap Inc. Method and system for integrating real time communication features in applications
US10069876B1 (en) 2013-11-26 2018-09-04 Snap Inc. Method and system for integrating real time communication features in applications
US9936030B2 (en) 2014-01-03 2018-04-03 Investel Capital Corporation User content sharing system and method with location-based external content integration
US9866999B1 (en) 2014-01-12 2018-01-09 Investment Asset Holdings Llc Location-based messaging
US10080102B1 (en) 2014-01-12 2018-09-18 Investment Asset Holdings Llc Location-based messaging
US10349209B1 (en) 2014-01-12 2019-07-09 Investment Asset Holdings Llc Location-based messaging
US11463393B2 (en) 2014-02-21 2022-10-04 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US10958605B1 (en) 2014-02-21 2021-03-23 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US10082926B1 (en) 2014-02-21 2018-09-25 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US10949049B1 (en) 2014-02-21 2021-03-16 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US11463394B2 (en) 2014-02-21 2022-10-04 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US11902235B2 (en) 2014-02-21 2024-02-13 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US10084735B1 (en) 2014-02-21 2018-09-25 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US9237202B1 (en) 2014-03-07 2016-01-12 Snapchat, Inc. Content delivery network for ephemeral objects
US9407712B1 (en) 2014-03-07 2016-08-02 Snapchat, Inc. Content delivery network for ephemeral objects
US11743219B2 (en) 2014-05-09 2023-08-29 Snap Inc. Dynamic configuration of application component tiles
US11310183B2 (en) 2014-05-09 2022-04-19 Snap Inc. Dynamic configuration of application component tiles
US9276886B1 (en) 2014-05-09 2016-03-01 Snapchat, Inc. Apparatus and method for dynamically configuring application component tiles
US10817156B1 (en) 2014-05-09 2020-10-27 Snap Inc. Dynamic configuration of application component tiles
US10990697B2 (en) 2014-05-28 2021-04-27 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US9396354B1 (en) 2014-05-28 2016-07-19 Snapchat, Inc. Apparatus and method for automated privacy protection in distributed images
US10572681B1 (en) 2014-05-28 2020-02-25 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US9785796B1 (en) 2014-05-28 2017-10-10 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US11625443B2 (en) 2014-06-05 2023-04-11 Snap Inc. Web document enhancement
US11921805B2 (en) 2014-06-05 2024-03-05 Snap Inc. Web document enhancement
US10524087B1 (en) 2014-06-13 2019-12-31 Snap Inc. Message destination list mechanism
US10182311B2 (en) 2014-06-13 2019-01-15 Snap Inc. Prioritization of messages within a message collection
US9094137B1 (en) 2014-06-13 2015-07-28 Snapchat, Inc. Priority based placement of messages in a geo-location based event gallery
US9113301B1 (en) 2014-06-13 2015-08-18 Snapchat, Inc. Geo-location based event gallery
US9430783B1 (en) 2014-06-13 2016-08-30 Snapchat, Inc. Prioritization of messages within gallery
US10659914B1 (en) 2014-06-13 2020-05-19 Snap Inc. Geo-location based event gallery
US9532171B2 (en) 2014-06-13 2016-12-27 Snap Inc. Geo-location based event gallery
US10623891B2 (en) 2014-06-13 2020-04-14 Snap Inc. Prioritization of messages within a message collection
US11317240B2 (en) 2014-06-13 2022-04-26 Snap Inc. Geo-location based event gallery
US9693191B2 (en) 2014-06-13 2017-06-27 Snap Inc. Prioritization of messages within gallery
US9825898B2 (en) 2014-06-13 2017-11-21 Snap Inc. Prioritization of messages within a message collection
US10779113B2 (en) 2014-06-13 2020-09-15 Snap Inc. Prioritization of messages within a message collection
US10200813B1 (en) 2014-06-13 2019-02-05 Snap Inc. Geo-location based event gallery
US11166121B2 (en) 2014-06-13 2021-11-02 Snap Inc. Prioritization of messages within a message collection
US10448201B1 (en) 2014-06-13 2019-10-15 Snap Inc. Prioritization of messages within a message collection
US11849214B2 (en) * 2014-07-07 2023-12-19 Snap Inc. Apparatus and method for supplying content aware photo filters
US10602057B1 (en) * 2014-07-07 2020-03-24 Snap Inc. Supplying content aware photo filters
US11496673B1 (en) 2014-07-07 2022-11-08 Snap Inc. Apparatus and method for supplying content aware photo filters
US10701262B1 (en) 2014-07-07 2020-06-30 Snap Inc. Apparatus and method for supplying content aware photo filters
US9225897B1 (en) * 2014-07-07 2015-12-29 Snapchat, Inc. Apparatus and method for supplying content aware photo filters
US10432850B1 (en) 2014-07-07 2019-10-01 Snap Inc. Apparatus and method for supplying content aware photo filters
US10154192B1 (en) 2014-07-07 2018-12-11 Snap Inc. Apparatus and method for supplying content aware photo filters
US11122200B2 (en) 2014-07-07 2021-09-14 Snap Inc. Supplying content aware photo filters
US10348960B1 (en) * 2014-07-07 2019-07-09 Snap Inc. Apparatus and method for supplying content aware photo filters
US9407816B1 (en) 2014-07-07 2016-08-02 Snapchat, Inc. Apparatus and method for supplying content aware photo filters
US20230020575A1 (en) * 2014-07-07 2023-01-19 Snap Inc. Apparatus and method for supplying content aware photo filters
US11595569B2 (en) 2014-07-07 2023-02-28 Snap Inc. Supplying content aware photo filters
US11017363B1 (en) 2014-08-22 2021-05-25 Snap Inc. Message processor with application prompts
US10055717B1 (en) 2014-08-22 2018-08-21 Snap Inc. Message processor with application prompts
US11625755B1 (en) 2014-09-16 2023-04-11 Foursquare Labs, Inc. Determining targeting information based on a predictive targeting model
US10423983B2 (en) 2014-09-16 2019-09-24 Snap Inc. Determining targeting information based on a predictive targeting model
US11741136B2 (en) 2014-09-18 2023-08-29 Snap Inc. Geolocation-based pictographs
US11281701B2 (en) 2014-09-18 2022-03-22 Snap Inc. Geolocation-based pictographs
US10824654B2 (en) 2014-09-18 2020-11-03 Snap Inc. Geolocation-based pictographs
US11216869B2 (en) 2014-09-23 2022-01-04 Snap Inc. User interface to augment an image using geolocation
US11038829B1 (en) 2014-10-02 2021-06-15 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US10284508B1 (en) 2014-10-02 2019-05-07 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US10958608B1 (en) 2014-10-02 2021-03-23 Snap Inc. Ephemeral gallery of visual media messages
US11012398B1 (en) 2014-10-02 2021-05-18 Snap Inc. Ephemeral message gallery user interface with screenshot messages
US10944710B1 (en) 2014-10-02 2021-03-09 Snap Inc. Ephemeral gallery user interface with remaining gallery time indication
US11855947B1 (en) 2014-10-02 2023-12-26 Snap Inc. Gallery of ephemeral messages
US10476830B2 (en) 2014-10-02 2019-11-12 Snap Inc. Ephemeral gallery of ephemeral messages
US20170374003A1 (en) 2014-10-02 2017-12-28 Snapchat, Inc. Ephemeral gallery of ephemeral messages
US10708210B1 (en) 2014-10-02 2020-07-07 Snap Inc. Multi-user ephemeral message gallery
US9537811B2 (en) 2014-10-02 2017-01-03 Snap Inc. Ephemeral gallery of ephemeral messages
US11411908B1 (en) 2014-10-02 2022-08-09 Snap Inc. Ephemeral message gallery user interface with online viewing history indicia
US11522822B1 (en) 2014-10-02 2022-12-06 Snap Inc. Ephemeral gallery elimination based on gallery and message timers
US10616476B1 (en) 2014-11-12 2020-04-07 Snap Inc. User interface for accessing media at a geographic location
US11956533B2 (en) 2014-11-12 2024-04-09 Snap Inc. Accessing media at a geographic location
US11190679B2 (en) 2014-11-12 2021-11-30 Snap Inc. Accessing media at a geographic location
US9843720B1 (en) 2014-11-12 2017-12-12 Snap Inc. User interface for accessing media at a geographic location
US9385983B1 (en) 2014-12-19 2016-07-05 Snapchat, Inc. Gallery of messages from individuals with a shared interest
US11250887B2 (en) 2014-12-19 2022-02-15 Snap Inc. Routing messages by message parameter
US10811053B2 (en) 2014-12-19 2020-10-20 Snap Inc. Routing messages by message parameter
US11783862B2 (en) 2014-12-19 2023-10-10 Snap Inc. Routing messages by message parameter
US11803345B2 (en) 2014-12-19 2023-10-31 Snap Inc. Gallery of messages from individuals with a shared interest
US10311916B2 (en) 2014-12-19 2019-06-04 Snap Inc. Gallery of videos set to an audio time line
US10580458B2 (en) 2014-12-19 2020-03-03 Snap Inc. Gallery of videos set to an audio time line
US11372608B2 (en) 2014-12-19 2022-06-28 Snap Inc. Gallery of messages from individuals with a shared interest
US10514876B2 (en) 2014-12-19 2019-12-24 Snap Inc. Gallery of messages from individuals with a shared interest
US9854219B2 (en) 2014-12-19 2017-12-26 Snap Inc. Gallery of videos set to an audio time line
US10157449B1 (en) 2015-01-09 2018-12-18 Snap Inc. Geo-location-based image filters
US11734342B2 (en) 2015-01-09 2023-08-22 Snap Inc. Object recognition based image overlays
US11301960B2 (en) 2015-01-09 2022-04-12 Snap Inc. Object recognition based image filters
US10380720B1 (en) 2015-01-09 2019-08-13 Snap Inc. Location-based image filters
US11388226B1 (en) 2015-01-13 2022-07-12 Snap Inc. Guided personal identity based actions
US11962645B2 (en) 2015-01-13 2024-04-16 Snap Inc. Guided personal identity based actions
US10416845B1 (en) 2015-01-19 2019-09-17 Snap Inc. Multichannel system
US11249617B1 (en) 2015-01-19 2022-02-15 Snap Inc. Multichannel system
US10133705B1 (en) 2015-01-19 2018-11-20 Snap Inc. Multichannel system
US10932085B1 (en) 2015-01-26 2021-02-23 Snap Inc. Content request by location
US11528579B2 (en) 2015-01-26 2022-12-13 Snap Inc. Content request by location
US10536800B1 (en) 2015-01-26 2020-01-14 Snap Inc. Content request by location
US10123166B2 (en) 2015-01-26 2018-11-06 Snap Inc. Content request by location
US11910267B2 (en) 2015-01-26 2024-02-20 Snap Inc. Content request by location
US10223397B1 (en) 2015-03-13 2019-03-05 Snap Inc. Social graph based co-location of network users
US10893055B2 (en) 2015-03-18 2021-01-12 Snap Inc. Geo-fence authorization provisioning
US11902287B2 (en) 2015-03-18 2024-02-13 Snap Inc. Geo-fence authorization provisioning
US10616239B2 (en) 2015-03-18 2020-04-07 Snap Inc. Geo-fence authorization provisioning
US10948717B1 (en) 2015-03-23 2021-03-16 Snap Inc. Reducing boot time and power consumption in wearable display systems
US11662576B2 (en) 2015-03-23 2023-05-30 Snap Inc. Reducing boot time and power consumption in displaying data content
US11320651B2 (en) 2015-03-23 2022-05-03 Snap Inc. Reducing boot time and power consumption in displaying data content
US11496544B2 (en) 2015-05-05 2022-11-08 Snap Inc. Story and sub-story navigation
US11392633B2 (en) 2015-05-05 2022-07-19 Snap Inc. Systems and methods for automated local story generation and curation
US10911575B1 (en) 2015-05-05 2021-02-02 Snap Inc. Systems and methods for story and sub-story navigation
US10135949B1 (en) 2015-05-05 2018-11-20 Snap Inc. Systems and methods for story and sub-story navigation
US11449539B2 (en) 2015-05-05 2022-09-20 Snap Inc. Automated local story generation and curation
US10592574B2 (en) 2015-05-05 2020-03-17 Snap Inc. Systems and methods for automated local story generation and curation
US10993069B2 (en) 2015-07-16 2021-04-27 Snap Inc. Dynamically adaptive media content delivery
US11961116B2 (en) 2015-08-13 2024-04-16 Foursquare Labs, Inc. Determining exposures to content presented by physical objects
US10817898B2 (en) 2015-08-13 2020-10-27 Placed, Llc Determining exposures to content presented by physical objects
US11769307B2 (en) 2015-10-30 2023-09-26 Snap Inc. Image based tracking in augmented reality systems
US11315331B2 (en) 2015-10-30 2022-04-26 Snap Inc. Image based tracking in augmented reality systems
US10733802B2 (en) 2015-10-30 2020-08-04 Snap Inc. Image based tracking in augmented reality systems
US10366543B1 (en) 2015-10-30 2019-07-30 Snap Inc. Image based tracking in augmented reality systems
US11380051B2 (en) 2015-11-30 2022-07-05 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US11599241B2 (en) 2015-11-30 2023-03-07 Snap Inc. Network resource location linking and visual content sharing
US10474321B2 (en) 2015-11-30 2019-11-12 Snap Inc. Network resource location linking and visual content sharing
US10997783B2 (en) 2015-11-30 2021-05-04 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US11468615B2 (en) 2015-12-18 2022-10-11 Snap Inc. Media overlay publication system
US11830117B2 (en) 2015-12-18 2023-11-28 Snap Inc Media overlay publication system
US10997758B1 (en) 2015-12-18 2021-05-04 Snap Inc. Media overlay publication system
US10354425B2 (en) 2015-12-18 2019-07-16 Snap Inc. Method and system for providing context relevant media augmentation
US11197123B2 (en) 2016-02-26 2021-12-07 Snap Inc. Generation, curation, and presentation of media collections
US10834525B2 (en) 2016-02-26 2020-11-10 Snap Inc. Generation, curation, and presentation of media collections
US11889381B2 (en) 2016-02-26 2024-01-30 Snap Inc. Generation, curation, and presentation of media collections
US11023514B2 (en) 2016-02-26 2021-06-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US10679389B2 (en) 2016-02-26 2020-06-09 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US11611846B2 (en) 2016-02-26 2023-03-21 Snap Inc. Generation, curation, and presentation of media collections
US11631276B2 (en) 2016-03-31 2023-04-18 Snap Inc. Automated avatar generation
US11900418B2 (en) 2016-04-04 2024-02-13 Snap Inc. Mutable geo-fencing system
US11044393B1 (en) 2016-06-20 2021-06-22 Pipbin, Inc. System for curation and display of location-dependent augmented reality content in an augmented estate system
US11876941B1 (en) 2016-06-20 2024-01-16 Pipbin, Inc. Clickable augmented reality content manager, system, and network
US11201981B1 (en) 2016-06-20 2021-12-14 Pipbin, Inc. System for notification of user accessibility of curated location-dependent content in an augmented estate
US10839219B1 (en) 2016-06-20 2020-11-17 Pipbin, Inc. System for curation, distribution and display of location-dependent augmented reality content
US10638256B1 (en) 2016-06-20 2020-04-28 Pipbin, Inc. System for distribution and display of mobile targeted augmented reality content
US10805696B1 (en) 2016-06-20 2020-10-13 Pipbin, Inc. System for recording and targeting tagged content of user interest
US11785161B1 (en) 2016-06-20 2023-10-10 Pipbin, Inc. System for user accessibility of tagged curated augmented reality content
US10992836B2 (en) 2016-06-20 2021-04-27 Pipbin, Inc. Augmented property system of curated augmented reality media elements
US10165402B1 (en) 2016-06-28 2018-12-25 Snap Inc. System to track engagement of media items
US10430838B1 (en) 2016-06-28 2019-10-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections with automated advertising
US11445326B2 (en) 2016-06-28 2022-09-13 Snap Inc. Track engagement of media items
US10327100B1 (en) 2016-06-28 2019-06-18 Snap Inc. System to track engagement of media items
US10506371B2 (en) 2016-06-28 2019-12-10 Snap Inc. System to track engagement of media items
US10785597B2 (en) 2016-06-28 2020-09-22 Snap Inc. System to track engagement of media items
US10219110B2 (en) 2016-06-28 2019-02-26 Snap Inc. System to track engagement of media items
US10735892B2 (en) 2016-06-28 2020-08-04 Snap Inc. System to track engagement of media items
US10885559B1 (en) 2016-06-28 2021-01-05 Snap Inc. Generation, curation, and presentation of media collections with automated advertising
US11640625B2 (en) 2016-06-28 2023-05-02 Snap Inc. Generation, curation, and presentation of media collections with automated advertising
US11895068B2 (en) 2016-06-30 2024-02-06 Snap Inc. Automated content curation and communication
US11080351B1 (en) 2016-06-30 2021-08-03 Snap Inc. Automated content curation and communication
US10387514B1 (en) 2016-06-30 2019-08-20 Snap Inc. Automated content curation and communication
US11509615B2 (en) 2016-07-19 2022-11-22 Snap Inc. Generating customized electronic messaging graphics
US10348662B2 (en) 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
US11816853B2 (en) 2016-08-30 2023-11-14 Snap Inc. Systems and methods for simultaneous localization and mapping
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
US11876762B1 (en) 2016-10-24 2024-01-16 Snap Inc. Generating and displaying customized avatars in media overlays
US10623666B2 (en) 2016-11-07 2020-04-14 Snap Inc. Selective identification and order of image modifiers
US11233952B2 (en) 2016-11-07 2022-01-25 Snap Inc. Selective identification and order of image modifiers
US11750767B2 (en) 2016-11-07 2023-09-05 Snap Inc. Selective identification and order of image modifiers
US10754525B1 (en) 2016-12-09 2020-08-25 Snap Inc. Customized media overlays
US10203855B2 (en) 2016-12-09 2019-02-12 Snap Inc. Customized user-controlled media overlays
US11397517B2 (en) 2016-12-09 2022-07-26 Snap Inc. Customized media overlays
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US11870743B1 (en) 2017-01-23 2024-01-09 Snap Inc. Customized digital avatar accessories
US10915911B2 (en) 2017-02-03 2021-02-09 Snap Inc. System to determine a price-schedule to distribute media content
US11861795B1 (en) 2017-02-17 2024-01-02 Snap Inc. Augmented reality anamorphosis system
US11720640B2 (en) 2017-02-17 2023-08-08 Snap Inc. Searching social media content
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US11250075B1 (en) 2017-02-17 2022-02-15 Snap Inc. Searching social media content
US11748579B2 (en) 2017-02-20 2023-09-05 Snap Inc. Augmented reality speech balloon system
US11189299B1 (en) 2017-02-20 2021-11-30 Snap Inc. Augmented reality speech balloon system
US11037372B2 (en) 2017-03-06 2021-06-15 Snap Inc. Virtual vision system
US11670057B2 (en) 2017-03-06 2023-06-06 Snap Inc. Virtual vision system
US11961196B2 (en) 2017-03-06 2024-04-16 Snap Inc. Virtual vision system
US10887269B1 (en) 2017-03-09 2021-01-05 Snap Inc. Restricted group content collection
US11258749B2 (en) 2017-03-09 2022-02-22 Snap Inc. Restricted group content collection
US10523625B1 (en) 2017-03-09 2019-12-31 Snap Inc. Restricted group content collection
US11297399B1 (en) 2017-03-27 2022-04-05 Snap Inc. Generating a stitched data stream
US10581782B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US11558678B2 (en) 2017-03-27 2023-01-17 Snap Inc. Generating a stitched data stream
US10582277B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US11349796B2 (en) 2017-03-27 2022-05-31 Snap Inc. Generating a stitched data stream
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US11195018B1 (en) 2017-04-20 2021-12-07 Snap Inc. Augmented reality typography personalization system
US11451956B1 (en) 2017-04-27 2022-09-20 Snap Inc. Location privacy management on map-based social media platforms
US11392264B1 (en) 2017-04-27 2022-07-19 Snap Inc. Map-based graphical user interface for multi-type social media galleries
US11556221B2 (en) 2017-04-27 2023-01-17 Snap Inc. Friend location sharing mechanism for social media platforms
US11474663B2 (en) 2017-04-27 2022-10-18 Snap Inc. Location-based search mechanism in a graphical user interface
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US11418906B2 (en) 2017-04-27 2022-08-16 Snap Inc. Selective location-based identity communication
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11409407B2 (en) 2017-04-27 2022-08-09 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11782574B2 (en) 2017-04-27 2023-10-10 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US11385763B2 (en) 2017-04-27 2022-07-12 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11232040B1 (en) 2017-04-28 2022-01-25 Snap Inc. Precaching unlockable data elements
US11675831B2 (en) 2017-05-31 2023-06-13 Snap Inc. Geolocation based playlists
US11475254B1 (en) 2017-09-08 2022-10-18 Snap Inc. Multimodal entity identification
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US11721080B2 (en) 2017-09-15 2023-08-08 Snap Inc. Augmented reality system
US11335067B2 (en) 2017-09-15 2022-05-17 Snap Inc. Augmented reality system
US11006242B1 (en) 2017-10-09 2021-05-11 Snap Inc. Context sensitive presentation of content
US10499191B1 (en) 2017-10-09 2019-12-03 Snap Inc. Context sensitive presentation of content
US11617056B2 (en) 2017-10-09 2023-03-28 Snap Inc. Context sensitive presentation of content
US11030787B2 (en) 2017-10-30 2021-06-08 Snap Inc. Mobile-based cartographic control of display content
US11670025B2 (en) 2017-10-30 2023-06-06 Snap Inc. Mobile-based cartographic control of display content
US11558327B2 (en) 2017-12-01 2023-01-17 Snap Inc. Dynamic media overlay with smart widget
US11943185B2 (en) 2017-12-01 2024-03-26 Snap Inc. Dynamic media overlay with smart widget
US11265273B1 (en) 2017-12-01 2022-03-01 Snap, Inc. Dynamic media overlay with smart widget
US11687720B2 (en) 2017-12-22 2023-06-27 Snap Inc. Named entity recognition visual context and caption data
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US11487794B2 (en) 2018-01-03 2022-11-01 Snap Inc. Tag distribution visualization system
US10678818B2 (en) 2018-01-03 2020-06-09 Snap Inc. Tag distribution visualization system
US11841896B2 (en) 2018-02-13 2023-12-12 Snap Inc. Icon based tagging
US11507614B1 (en) 2018-02-13 2022-11-22 Snap Inc. Icon based tagging
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US11523159B2 (en) 2018-02-28 2022-12-06 Snap Inc. Generating media content items based on location information
US11722837B2 (en) 2018-03-06 2023-08-08 Snap Inc. Geo-fence selection system
US10327096B1 (en) 2018-03-06 2019-06-18 Snap Inc. Geo-fence selection system
US10524088B2 (en) 2018-03-06 2019-12-31 Snap Inc. Geo-fence selection system
US11044574B2 (en) 2018-03-06 2021-06-22 Snap Inc. Geo-fence selection system
US11570572B2 (en) 2018-03-06 2023-01-31 Snap Inc. Geo-fence selection system
US10933311B2 (en) 2018-03-14 2021-03-02 Snap Inc. Generating collectible items based on location information
US11491393B2 (en) 2018-03-14 2022-11-08 Snap Inc. Generating collectible items based on location information
US11163941B1 (en) 2018-03-30 2021-11-02 Snap Inc. Annotating a collection of media content items
US10219111B1 (en) 2018-04-18 2019-02-26 Snap Inc. Visitation tracking system
US11683657B2 (en) 2018-04-18 2023-06-20 Snap Inc. Visitation tracking system
US10681491B1 (en) 2018-04-18 2020-06-09 Snap Inc. Visitation tracking system
US11297463B2 (en) 2018-04-18 2022-04-05 Snap Inc. Visitation tracking system
US10448199B1 (en) 2018-04-18 2019-10-15 Snap Inc. Visitation tracking system
US10779114B2 (en) 2018-04-18 2020-09-15 Snap Inc. Visitation tracking system
US10924886B2 (en) 2018-04-18 2021-02-16 Snap Inc. Visitation tracking system
US11860888B2 (en) 2018-05-22 2024-01-02 Snap Inc. Event detection system
US11367234B2 (en) 2018-07-24 2022-06-21 Snap Inc. Conditional modification of augmented reality object
US10943381B2 (en) 2018-07-24 2021-03-09 Snap Inc. Conditional modification of augmented reality object
US10679393B2 (en) 2018-07-24 2020-06-09 Snap Inc. Conditional modification of augmented reality object
US11670026B2 (en) 2018-07-24 2023-06-06 Snap Inc. Conditional modification of augmented reality object
US10789749B2 (en) 2018-07-24 2020-09-29 Snap Inc. Conditional modification of augmented reality object
US11450050B2 (en) 2018-08-31 2022-09-20 Snap Inc. Augmented reality anthropomorphization system
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11676319B2 (en) 2018-08-31 2023-06-13 Snap Inc. Augmented reality anthropomorphtzation system
US11455082B2 (en) 2018-09-28 2022-09-27 Snap Inc. Collaborative achievement interface
US11704005B2 (en) 2018-09-28 2023-07-18 Snap Inc. Collaborative achievement interface
US11799811B2 (en) 2018-10-31 2023-10-24 Snap Inc. Messaging and gaming applications communication platform
US11698722B2 (en) 2018-11-30 2023-07-11 Snap Inc. Generating customized avatars based on location information
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11812335B2 (en) 2018-11-30 2023-11-07 Snap Inc. Position service to determine relative position to map features
US11558709B2 (en) 2018-11-30 2023-01-17 Snap Inc. Position service to determine relative position to map features
US11877211B2 (en) 2019-01-14 2024-01-16 Snap Inc. Destination sharing in location sharing system
US11751015B2 (en) 2019-01-16 2023-09-05 Snap Inc. Location-based context information sharing in a messaging system
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11693887B2 (en) 2019-01-30 2023-07-04 Snap Inc. Adaptive spatial density based clustering
US11972529B2 (en) 2019-02-01 2024-04-30 Snap Inc. Augmented reality system
US11809624B2 (en) 2019-02-13 2023-11-07 Snap Inc. Sleep detection in a location sharing system
US11500525B2 (en) 2019-02-25 2022-11-15 Snap Inc. Custom media overlay system
US11954314B2 (en) 2019-02-25 2024-04-09 Snap Inc. Custom media overlay system
US11574431B2 (en) 2019-02-26 2023-02-07 Snap Inc. Avatar based on weather
US11301117B2 (en) 2019-03-08 2022-04-12 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11740760B2 (en) 2019-03-28 2023-08-29 Snap Inc. Generating personalized map interface with enhanced icons
US11249614B2 (en) 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US11361493B2 (en) 2019-04-01 2022-06-14 Snap Inc. Semantic texture mapping system
US11206615B2 (en) 2019-05-30 2021-12-21 Snap Inc. Wearable device location systems
US11606755B2 (en) 2019-05-30 2023-03-14 Snap Inc. Wearable device location systems architecture
US11963105B2 (en) 2019-05-30 2024-04-16 Snap Inc. Wearable device location systems architecture
US11785549B2 (en) 2019-05-30 2023-10-10 Snap Inc. Wearable device location systems
US11917495B2 (en) 2019-06-07 2024-02-27 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11601783B2 (en) 2019-06-07 2023-03-07 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11714535B2 (en) 2019-07-11 2023-08-01 Snap Inc. Edge gesture interface with smart interactions
US11821742B2 (en) 2019-09-26 2023-11-21 Snap Inc. Travel based notifications
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US11729343B2 (en) 2019-12-30 2023-08-15 Snap Inc. Including video feed in message thread
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11943303B2 (en) 2019-12-31 2024-03-26 Snap Inc. Augmented reality objects registry
US11893208B2 (en) 2019-12-31 2024-02-06 Snap Inc. Combined map icon with action indicator
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
US11228551B1 (en) 2020-02-12 2022-01-18 Snap Inc. Multiple gateway message exchange
US11888803B2 (en) 2020-02-12 2024-01-30 Snap Inc. Multiple gateway message exchange
US11765117B2 (en) 2020-03-05 2023-09-19 Snap Inc. Storing data based on device location
US11516167B2 (en) 2020-03-05 2022-11-29 Snap Inc. Storing data based on device location
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11776256B2 (en) 2020-03-27 2023-10-03 Snap Inc. Shared augmented reality system
US11915400B2 (en) 2020-03-27 2024-02-27 Snap Inc. Location mapping for large scale augmented-reality
US11430091B2 (en) 2020-03-27 2022-08-30 Snap Inc. Location mapping for large scale augmented-reality
US11314776B2 (en) 2020-06-15 2022-04-26 Snap Inc. Location sharing using friend list versions
US11290851B2 (en) 2020-06-15 2022-03-29 Snap Inc. Location sharing using offline and online objects
US11503432B2 (en) 2020-06-15 2022-11-15 Snap Inc. Scalable real-time location sharing framework
US11483267B2 (en) 2020-06-15 2022-10-25 Snap Inc. Location sharing using different rate-limited links
US11676378B2 (en) 2020-06-29 2023-06-13 Snap Inc. Providing travel-based augmented reality content with a captured image
US11943192B2 (en) 2020-08-31 2024-03-26 Snap Inc. Co-location connection service
US11601888B2 (en) 2021-03-29 2023-03-07 Snap Inc. Determining location using multi-source geolocation data
US11606756B2 (en) 2021-03-29 2023-03-14 Snap Inc. Scheduling requests for location data
US11902902B2 (en) 2021-03-29 2024-02-13 Snap Inc. Scheduling requests for location data
US11645324B2 (en) 2021-03-31 2023-05-09 Snap Inc. Location-based timeline media content system
US11972014B2 (en) 2021-04-19 2024-04-30 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US11829834B2 (en) 2021-10-29 2023-11-28 Snap Inc. Extended QR code

Also Published As

Publication number Publication date
CN102469232B (en) 2014-08-13
JP5212448B2 (en) 2013-06-19
US20120120259A1 (en) 2012-05-17
JP2012105235A (en) 2012-05-31
CN102469232A (en) 2012-05-23

Similar Documents

Publication Publication Date Title
US8471914B2 (en) Image processing system with ease of operation
US9128644B2 (en) Image processing system including an image processing apparatus and a portable terminal
US20120182432A1 (en) Image processing system including portable terminal
US9304728B2 (en) Generating a map of image forming devices on a mobile device
US9652183B2 (en) Image processing system for identifying image processing apparatuses using a terminal apparatus
US20160306596A1 (en) Terminal, information processing apparatus, image forming system, and non-transitory computer readable medium
US9258440B2 (en) Image forming apparatus, remote operation device, remote control method, remote operation method, non-transitory computer-readable recording medium encoded with remote control program, and non-transitory computer-readable recording medium encoded with remote operation program for performing remote operation
JP5696489B2 (en) Server apparatus, printing system, and printing method
JP2016009228A (en) Handheld terminal, handheld terminal control program, and network input/output system
JP6265717B2 (en) Information processing apparatus, control method for information processing apparatus, and program
JP2012194649A (en) Image processing system
JP2018005295A (en) Program and mobile terminal
JP2012104036A (en) Image processing system, control method of image processing apparatus, portable terminal, information processing apparatus and control program
US10728418B2 (en) Remote control system method, and program for image processing apparatus
JP5811722B2 (en) Image processing system, server, control method, and control program
US9191546B2 (en) Non-transitory computer-readable recording medium storing computer-readable instructions for information processing apparatus, information processing apparatus, and method for controlling information processing apparatus
JP5673121B2 (en) Server apparatus, printing system, and printing method
JP5780081B2 (en) Image processing system, server, display method, and control program
US9648177B2 (en) Remote control apparatus, remote control method, and non-transitory computer-readable recording medium encoded with remote control program
JP6975414B2 (en) Programs and mobile terminals
JP7035124B2 (en) Information processing equipment, control methods, and programs
JP6023032B2 (en) Mobile terminal, device management system, and device management program
JP5494469B2 (en) Image processing system, information processing apparatus, and portable terminal control program
JP6155968B2 (en) Image forming apparatus, remote operation method notification method, and remote operation method notification program
JP5994615B2 (en) Remote control system, portable information device, remote control method, and remote control program

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONICA MINOLTA BUSINESS TECHNOLOGIES, INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAKIYAMA, DAISUKE;MORIKAWA, TAKESHI;MINAMI, TAKESHI;AND OTHERS;REEL/FRAME:027270/0699

Effective date: 20111024

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8