US20150116529A1 - Automatic effect method for photography and electronic apparatus - Google Patents

Automatic effect method for photography and electronic apparatus

Info

Publication number
US20150116529A1
Authority
US
United States
Prior art keywords
effect
image data
photography
camera set
electronic apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/272,513
Inventor
Jing-Lung Wu
Hsin-Ti Chueh
Fu-Chang Tseng
Pol-Lin Tai
Yu-Cheng Hsu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HTC Corp
Original Assignee
HTC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HTC Corp filed Critical HTC Corp
Priority to US14/272,513 (published as US20150116529A1)
Priority to DE201410010152 (published as DE102014010152A1)
Assigned to HTC CORPORATION. Assignment of assignors interest (see document for details). Assignors: CHUEH, HSIN-TI; HSU, YU-CHENG; TAI, POL-LIN; TSENG, FU-CHANG; WU, JING-LUNG
Priority to TW103124395A (published as TWI549503B)
Priority to CN201410362346.6A (published as CN104580878B)
Publication of US20150116529A1

Classifications

    • H04N5/23222
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance

Abstract

An electronic apparatus includes a camera set, an input source module, an auto-engine module and a post usage module. The camera set is configured for capturing image data relative to a scene. The input source module is configured for gathering information related to the image data. The auto-engine module is configured for determining at least one suitable photography effect from a plurality of candidate photography effects according to the information related to the image data. The post usage module is configured for processing the image data and applying the suitable photography effect to the image data after the image data are captured.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the priority benefit of U.S. Provisional Application Ser. No. 61/896,136, filed Oct. 28, 2013, and No. 61/923,780, filed Jan. 6, 2014, the full disclosures of which are incorporated herein by reference.
  • FIELD OF INVENTION
  • The invention relates to a photography method/device. More particularly, the invention relates to a method of determining a suitable photograph effect and a device thereof.
  • BACKGROUND
  • Photography used to be a professional job, because it required substantial knowledge to determine suitable configurations (e.g., controlling an exposure time, a white balance, a focal distance) for shooting a photo properly. As the complexity of manual photography configurations has increased, so have the operations and background knowledge required of the user.
  • Most digital cameras (or mobile devices with a camera module) have a variety of photography modes, e.g., smart capture, portrait, sport, dynamic, landscape, close-up, sunset, backlight, children, bright, self-portrait, night portrait, night landscape, high-ISO and panorama, which can be selected by the user in order to set the digital camera into a proper status before capturing photos.
  • On a digital camera, the photography mode can be selected from an operational menu displayed on the camera or by manipulating function keys on the camera.
  • SUMMARY
  • An aspect of the disclosure is to provide an electronic apparatus. The electronic apparatus includes a camera set, an input source module and an auto-engine module. The camera set is configured for capturing image data. The input source module is configured for gathering information related to the image data. The auto-engine module is configured for determining at least one suitable photography effect from a plurality of candidate photography effects according to the information related to the image data. The information includes a focusing distance of the camera set related to the image data.
  • Another aspect of the disclosure is to provide a method, suitable for an electronic apparatus with a camera set. The method includes steps of: capturing image data by the camera set; gathering information related to the image data, the information comprising a focusing distance of the camera set related to the image data; and, determining at least one suitable photography effect from a plurality of candidate photography effects according to the information related to the image data.
  • Another aspect of the disclosure is to provide a non-transitory computer readable storage medium with a computer program to execute an automatic effect method. The automatic effect method includes steps of: in response to image data being captured, gathering information related to the image data, the information comprising a focusing distance of the camera set related to the image data; and determining at least one suitable photography effect from a plurality of candidate photography effects according to the information related to the image data.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures. It is noted that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
  • FIG. 1 is a schematic diagram illustrating an electronic apparatus according to an embodiment of this disclosure;
  • FIG. 2 is a flow-chart diagram illustrating an automatic effect method utilized by the electronic apparatus in an illustrational example according to an embodiment;
  • FIG. 3 is a flow-chart diagram illustrating an automatic effect method utilized by the electronic apparatus in another illustrational example according to an embodiment;
  • FIG. 4A, FIG. 4B, FIG. 4C and FIG. 4D are examples of depth histograms corresponding to different depth distributions; and
  • FIG. 5 is a method for providing a user interface according to an embodiment of the disclosure.
  • DETAILED DESCRIPTION
  • The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
  • An embodiment of the disclosure introduces a method for automatically determining corresponding photography effects (e.g., an optical-like effect that changes the aperture, focus and depth of field of the image data by software simulation) based on various types of information, such as a focusing distance (acquired from a position of a voice coil motor), RGB histograms, a depth histogram and an image disparity. As a result, a user can generally capture photos without manually applying the effects, and appropriate photography effects/configurations can be detected automatically and applied during post-usage (e.g., when the user reviews the photos) in some embodiments. The details of these operations are disclosed in the following paragraphs.
  • Reference is made to FIG. 1, which is a schematic diagram illustrating an electronic apparatus 100 according to an embodiment of this disclosure. The electronic apparatus 100 includes a camera set 120, an input source module 140 and an auto-engine module 160. In the embodiment shown in FIG. 1, the electronic apparatus 100 further includes a post usage module 180 and a pre-processing module 150. The pre-processing module 150 is coupled with the input source module 140 and the auto-engine module 160.
  • The camera set 120 includes a camera module 122 and a focusing module 124. The camera module 122 is configured for capturing the image data. In practice, the camera module 122 can be a single camera unit, a pair of camera units (e.g., an implementation of dual cameras) or plural camera units (an implementation of multiple cameras). In the embodiment shown in FIG. 1, the camera module 122 includes two camera units 122a and 122b. The camera module 122 is configured for capturing image data relative to a scene. The image data can be processed and stored as a photo(s) on the electronic apparatus 100. In this embodiment, two sets of image data are captured individually by the two camera units 122a and 122b, and they can be processed and stored as two photos on the electronic apparatus 100.
  • The focusing module 124 is configured for regulating the focusing distance utilized by the camera module 122. In the embodiment shown in FIG. 1, the focusing module 124 includes a first focusing unit 124a and a second focusing unit 124b corresponding to the camera units 122a and 122b, respectively. For example, the first focusing unit 124a regulates a first focusing distance of the camera unit 122a, and the second focusing unit 124b regulates a second focusing distance of the camera unit 122b.
  • The focusing distance is a specific distance between a target object of the scene and the camera module 122. In an embodiment, each of the first focusing unit 124a and the second focusing unit 124b includes a voice coil motor (VCM) for regulating a focal length of the camera unit 122a/122b in correspondence to the focusing distance. In some embodiments, the focal length means the distance between the lens and a sensing array (e.g., a CCD/CMOS optical sensing array) within the camera unit 122a/122b of the camera module 122.
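  • As a rough optical model for the relation just described (an illustrative assumption of this rewrite, not stated in the disclosure), the thin-lens equation links the object-side focusing distance d_o and the lens-to-sensor distance d_i:

      \[ \frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i} \]

    Here f is the focal length of the lens. Driving the lens with the VCM changes d_i, so for a fixed f each VCM position corresponds to one focusing distance d_o, which is why the input source module can recover the focusing distance from the position of the voice coil motor.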
  • In some embodiments, the first focusing distance and the second focusing distance are regulated separately, such that the camera units 122a and 122b are capable of focusing on different target objects (e.g., a person in the foreground and a building in the background) at the same time within the target scene.
  • In other embodiments, the first focusing distance and the second focusing distance are synchronized to be the same, such that the two sets of image data outputted from the camera units 122a and 122b show the same target observed from slightly different viewing angles; the image data captured in this case are useful for establishing depth information or simulating 3D effects.
  • The input source module 140 is configured for gathering information related to the image data. In the embodiment, the information related to the image data includes the focusing distance(s). The input source module 140 acquires the focusing distance(s) from the focusing module 124 (e.g., according to a position of the voice coil motor).
  • In the embodiment shown in FIG. 1, the electronic apparatus 100 further includes a depth engine 190, which is configured for analyzing a depth distribution of the image data relative to the scene. In exemplary embodiments of the present disclosure, depth information can be obtained from, but is not limited to, analysis of images from a single camera, dual cameras, multiple cameras, or a single camera paired with a distance-detecting sensor such as a laser sensor, an infrared (IR) sensor, or a light-pattern sensor. The depth distribution can be represented by, for example, a depth histogram or a depth map. In the depth histogram, pixels within the image data are classified by their depth values, such that various objects (in the scene of the captured image data) located at different distances from the electronic apparatus 100 can be distinguished. In addition, the depth distribution can be utilized to analyze the main subject, edges of objects, spatial relationships between objects, and the foreground and background of the scene.
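  • To make the pixel-classification step concrete, the following sketch (an illustration, not code from the patent; the bin count and depth range are arbitrary assumptions) bins a per-pixel depth map into a depth histogram of the kind the depth engine 190 is described as producing:

      import numpy as np

      def depth_histogram(depth_map: np.ndarray, num_bins: int = 32,
                          max_depth_m: float = 10.0) -> np.ndarray:
          """Count pixels per depth bin; depth_map holds per-pixel distances in meters."""
          valid = depth_map[depth_map > 0]  # ignore pixels with unknown depth (0)
          hist, _ = np.histogram(valid, bins=num_bins, range=(0.0, max_depth_m))
          return hist

      # Peaks in the histogram correspond to objects clustered at similar distances;
      # e.g., two well-separated peaks resemble the foreground/background case of
      # depth histogram DH1 in FIG. 4A.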
  • In some embodiments, the information related to the image data gathered by the input source module 140 further includes the depth distribution from the depth engine 190 and the aforesaid analysis results derived from the depth distribution (e.g., the main subject, edges of objects, spatial relationships between objects, and the foreground and background of the scene).
  • In some embodiments, the information gathered by the input source module 140 further includes sensor information of the camera set 120, image characteristic information of the image data, system information of the electronic apparatus 100 and other related information.
  • The sensor information includes camera configurations of the camera module 122 (e.g., the camera module 122 is formed by single, dual or multiple camera units), automatic focus (AF) settings, automatic exposure (AE) settings and automatic white-balance (AWB) settings.
  • The image characteristic information of the image data includes analyzed results from the image data (e.g., scene detection outputs, face number detection outputs, and other detection outputs indicating portrait, group, or people position) and exchangeable image file format (EXIF) data related to the captured image data.
  • The system information includes a positioning location (e.g., GPS coordinates) and a system time of the electronic apparatus.
  • The aforesaid other related information can be histograms in Red, Green and Blue colors (RGB histograms), a brightness histogram indicating the lighting status of the scene (low light, flash light), a backlight module status, an over-exposure notification, a variation of frame intervals and/or a global shifting of the camera module 122. In some embodiments, the aforesaid related information can be outputs from an Image Signal Processor (ISP) of the electronic apparatus 100, not shown in FIG. 1.
  • The aforesaid information related to the image data (including the focusing distance, the depth distribution, the sensor information, the system information and/or other related information) can be gathered by the input source module 140 and stored along with the image data in the electronic apparatus 100.
  • It is noted that the gathered and stored information in this embodiment is not limited to affecting the parameters/configurations of the camera set 120 directly. Rather, the gathered and stored information can be utilized by the auto-engine module 160, after the image data is captured, to determine one or more suitable photography effects, appropriate or optimal for the image data, from among plural candidate photography effects.
  • The auto-engine module 160 is configured for determining and recommending at least one suitable photography effect from the candidate photography effects according to the information gathered by the input source module 140 and related to the image data. In some embodiments, the candidate photography effects include at least one effect selected from the group consisting of a bokeh effect, a refocus effect, a macro effect, a pseudo-3D effect, a 3D-alike effect, a 3D effect and a flyview animation effect.
  • The pre-processing module 150 is configured to determine, according to the image characteristic information, whether the captured image data is valid for applying any of the candidate photography effects, before the auto-engine module 160 is activated for determining and recommending the suitable photography effect. When the pre-processing module 150 detects that the captured image data is invalid for applying any candidate photography effect, the auto-engine module 160 is suspended from further computation, so that useless computation is avoided.
  • For example, the pre-processing module 150 in this embodiment determines, according to the EXIF data, whether the photography effects can be applied to the image data. In some practical applications, the EXIF data include dual image information corresponding to a pair of photos of the image data (from the dual camera units), time stamps corresponding to the pair of photos, and focusing distances of the pair of photos.
  • The dual image information indicates whether the pair of photos was captured by the dual camera units (e.g., two camera units in a dual-camera configuration). The dual image information will be valid when the pair of photos was captured by the dual camera units. The dual image information will be void when the pair of photos was captured by a single camera, or by different cameras which are not configured in the dual-camera configuration.
  • In an embodiment, when a time difference between the two time stamps of the dual photos is too large (e.g., larger than 100 ms), the pair of photos is not valid for applying the photography effect designed for dual camera units.
  • In another embodiment, when there are no valid focusing distances found in the EXIF data, it suggests that the pair of photos failed to focus on a specific target, such that the pair of photos is not valid for applying the photography effect designed for dual camera units.
  • In another embodiment, when the pre-processing module 150 fails to find any two related photos captured by dual camera units in the EXIF data, there is no valid pair of photos, such that the image data is not valid for applying the photography effect designed for dual camera units.
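  • The three rejection rules above amount to a simple gate. A minimal sketch follows (the field names and container are hypothetical; only the 100 ms threshold comes from the text above):

      from dataclasses import dataclass
      from typing import Optional, Tuple

      @dataclass
      class ExifPair:
          """Hypothetical view of the EXIF fields the pre-processing step reads."""
          is_dual_capture: bool                       # dual image information valid?
          timestamps_ms: Tuple[int, int]              # time stamps of the two photos
          focus_distances_m: Tuple[Optional[float], Optional[float]]

      MAX_TIMESTAMP_GAP_MS = 100

      def valid_for_dual_camera_effects(pair: Optional[ExifPair]) -> bool:
          """Mirror the checks attributed to the pre-processing module 150."""
          if pair is None or not pair.is_dual_capture:      # no valid pair of photos
              return False
          if abs(pair.timestamps_ms[0] - pair.timestamps_ms[1]) > MAX_TIMESTAMP_GAP_MS:
              return False                                   # shots too far apart in time
          if any(d is None for d in pair.focus_distances_m): # no valid focusing distance
              return False
          return True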
  • The post usage module 180 is configured for processing the image data and applying the suitable photography effect to the image data after the image data are captured. For example, when the user reviews images/photos stored in a digital album of the electronic apparatus 100, the auto-engine module 160 can recommend a list of suitable photography effects for each image/photo in the digital album. The suitable photography effects can be displayed, highlighted or enlarged in a user interface (not shown in figures) displayed on the electronic apparatus 100. Alternatively, the photography effects which are not suitable for a specific image/photo can be faded out or hidden from the list of photography effects. The user can select at least one effect from the recommended list shown in the user interface. Accordingly, if the user selects any of the recommended effects from the recommended list (which includes all of the suitable photography effects), the post usage module 180 applies the selected photography effect to the existing image data and then displays the result in the user interface.
  • In one embodiment, before any recommended effect has been selected by the user, the images/photos shown in the digital album of the electronic apparatus 100 may automatically receive a default photography effect (e.g., a random effect from the suitable photography effects, or a specific effect from the suitable photography effects). In another embodiment, after one of the recommended effects is selected by the user, the selected effect may be applied to the images/photos shown in the digital album automatically. If the user re-selects another effect from the recommended list, the most recently selected effect will be applied to the images/photos.
  • The bokeh effect generates a blur area within the original image data so as to simulate that the blur area was out of focus during image capture. The refocus effect re-assigns a focusing distance or an in-focus subject within the original image data so as to simulate the image data captured under another focusing distance. For example, an image/photo to which the refocus effect has been applied allows the user to re-assign the focusing point to a specific object of the scene, e.g., by touching/pointing on the touch screen of the electronic apparatus 100. The pseudo-3D or 3D-alike (also known as 2.5D) effect generates a series of images (or scenes) that simulate the appearance of 3D images through 2D graphical projections and similar techniques. The macro effect creates a 3D mesh on a specific object of the original image data in the scene to simulate capturing images through 3D viewing from different angles. The flyview animation effect separates an object and a background in the scene and generates a simulation animation in which the object is observed from different view angles along a moving pattern. Since many prior publications discuss how the aforesaid effects are produced, the technical details of generating them are omitted here.
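  • As a concrete, deliberately simplified reading of the bokeh description, the sketch below blurs every pixel whose depth lies outside a band around the chosen focus depth. It illustrates the general idea only, not the patent's algorithm; the tolerance and blur strength are arbitrary assumptions:

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def simple_bokeh(image: np.ndarray, depth_map: np.ndarray,
                       focus_depth_m: float, depth_tolerance_m: float = 0.5,
                       blur_sigma: float = 5.0) -> np.ndarray:
          """Blur everything outside a depth band to simulate an out-of-focus area.

          image: HxWx3 float array; depth_map: HxW per-pixel distances in meters.
          """
          blurred = np.stack([gaussian_filter(image[..., c], blur_sigma)
                              for c in range(image.shape[-1])], axis=-1)
          in_focus = np.abs(depth_map - focus_depth_m) <= depth_tolerance_m
          mask = in_focus[..., None].astype(image.dtype)
          return mask * image + (1.0 - mask) * blurred

      # A refocus interaction can reuse the same routine: map the touched point to
      # its depth value and call simple_bokeh with that depth as focus_depth_m.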
  • Some illustrational examples are introduced in the following paragraphs to demonstrate how the auto-engine module 160 determines and recommends the suitable photography effect from the candidate photography effects.
  • Reference is also made to FIG. 2, which is a flow-chart diagram illustrating an automatic effect method 200 utilized by the electronic apparatus 100 in an illustrational example according to an embodiment.
  • As shown in FIG. 1 and FIG. 2, operation S200 is executed for capturing image data by the camera set 120. Operation S202 is executed for gathering information related to the image data. In this case, the information includes a focusing distance of the camera set related to the image data. Operation S204 is executed for comparing the focusing distance with a predefined reference.
  • In this embodiment, some of the candidate photography effects are regarded as possible candidates when the focusing distance is shorter than the predefined reference. For example, the macro effect, the pseudo-3D effect, the 3D-alike effect, the 3D effect and the flyview animation effect are possible candidates when the focusing distance is shorter than the predefined reference, because the subject within the scene will be large and vivid enough for these effects when the focusing distance is short. In this embodiment, the macro effect, the pseudo-3D effect, the 3D-alike effect, the 3D effect and the flyview animation effect form a first sub-group within all of the candidate photography effects. Operation S206 is executed for selecting a suitable one from the first sub-group of the candidate photography effects as the suitable photography effect.
  • In this embodiment, some of the candidate photography effects are regarded as possible candidates when the focusing distance is longer than the predefined reference. For example, the bokeh effect and the refocus effect are possible candidates when the focusing distance is longer than the predefined reference, because objects in the foreground are easy to separate from objects in the background when the focusing distance is long, so the image data in this case is well suited to these effects. In this embodiment, the bokeh effect and the refocus effect form a second sub-group within all of the candidate photography effects. Operation S208 is executed for selecting a suitable one from the second sub-group of the candidate photography effects as the suitable photography effect.
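  • Read as pseudocode, operations S204 to S208 reduce to a threshold test on the focusing distance. The sketch below is an assumed rendering (the effect names and the 1.0 m reference value are placeholders, not values from the disclosure):

      NEAR_EFFECTS = ["macro", "pseudo-3D", "3D-alike", "3D", "flyview"]  # first sub-group
      FAR_EFFECTS = ["bokeh", "refocus"]                                  # second sub-group

      def candidate_subgroup(focus_distance_m: float,
                             reference_m: float = 1.0) -> list:
          """Pick the sub-group by comparing the focusing distance with a reference."""
          if focus_distance_m < reference_m:
              return NEAR_EFFECTS  # close subject: large and vivid enough for 3D-like effects
          return FAR_EFFECTS       # distant subject: foreground and background separate cleanly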
  • Reference is also made to FIG. 3, which is a flow-chart diagram illustrating an automatic effect method 300 utilized by the electronic apparatus 100 in another illustrational example according to an embodiment. In the embodiment shown in FIG. 3, the auto-engine module 160 determines and recommends the suitable photography effect, or a parameter thereof, according to the depth distribution in addition to the focusing distance and the other information related to the image data. For example, the parameter includes a sharpness level or a contrast strength level (applied to the bokeh effect and the refocus effect).
  • Reference is also made to FIG. 4A, FIG. 4B, FIG. 4C and FIG. 4D, which are examples of depth histograms corresponding to different depth distributions. FIG. 4A shows a depth histogram DH1, which indicates that there are at least two main objects in the image data. At least one of them is located at the foreground, and at least another is located at the background. FIG. 4B shows another depth histogram DH2, which indicates that there are several objects distributed evenly at different distances from the electronic apparatus 100. FIG. 4C shows another depth histogram DH3, which indicates that there are objects gathered at the far end from the electronic apparatus 100. FIG. 4D shows another depth histogram DH4, which indicates that there are objects gathered at the near end adjacent to the electronic apparatus 100.
  • In FIG. 3, operations S300, S302 and S304 are the same as operations S200, S202 and S204, respectively. When the focusing distance is shorter than the predefined reference, operation S306 is further executed for determining the depth histogram DH of the image data. If the depth histogram DH of the image data is similar to the depth histogram DH4 shown in FIG. 4D, operation S310 is executed for selecting the flyview animation effect, the pseudo-3D effect or the 3D-alike effect as the suitable photography effect, because the main object of the image data is obvious in this situation.
  • When the focusing distance is shorter than the predefined reference and the depth histogram DH of the image data is similar to the depth histogram DH2 shown in FIG. 4B, operation S312 is executed for selecting the macro effect, the pseudo-3D effect or the 3D-alike effect as the suitable photography effect, because there are many objects in the image data.
  • When the focusing distance is longer than the predefined reference, operation S308 is further executed for determining the depth histogram DH of the image data. If the depth histogram DH of the image data is similar to the depth histogram DH1 shown in FIG. 4A, operation S314 is executed for selecting and applying the bokeh effect and the refocus effect at a sharp level, i.e., with a high contrast strength level for the bokeh effect, because two main objects are located at the foreground and the background in the image data.
  • When the focusing distance is longer than the predefined reference and the depth histogram DH of the image data is similar to the depth histogram DH2 shown in FIG. 4B, operation S316 is executed for selecting and applying the bokeh effect and the refocus effect at a smooth level, i.e., with a low contrast strength level for the bokeh effect, because many objects are located at different distances in the image data.
  • When the focusing distance is longer than the predefined reference and the depth histogram DH of the image data is similar to the depth histogram DH3 shown in FIG. 4C, the bokeh effect is not suitable, because the objects are all located at the far end in the image data.
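  • Putting the two stages together, method 300 first thresholds the focusing distance and then inspects the shape of the depth histogram. The sketch below is a simplified stand-in; the shape tests and thresholds are assumptions of this rewrite, since the patent does not specify how histograms are matched to DH1-DH4:

      import numpy as np

      def classify_histogram(hist: np.ndarray) -> str:
          """Crudely map a depth histogram onto the DH1-DH4 archetypes of FIG. 4A-4D."""
          total = hist.sum()
          if total == 0:
              return "DH2"  # degenerate input: treat as evenly spread
          third = len(hist) // 3
          near = hist[:third].sum() / total
          mid = hist[third:2 * third].sum() / total
          far = hist[2 * third:].sum() / total
          if near > 0.7:
              return "DH4"  # objects gathered at the near end
          if far > 0.7:
              return "DH3"  # objects gathered at the far end
          if near > 0.35 and far > 0.35 and mid < 0.2:
              return "DH1"  # two main objects: foreground and background
          return "DH2"      # objects spread evenly over distances

      def recommend(focus_distance_m: float, hist: np.ndarray,
                    reference_m: float = 1.0) -> list:
          shape = classify_histogram(hist)
          if focus_distance_m < reference_m:                   # S306 branch
              if shape == "DH4":
                  return ["flyview", "pseudo-3D", "3D-alike"]  # S310
              return ["macro", "pseudo-3D", "3D-alike"]        # S312
          if shape == "DH1":                                   # S308 branch
              return [("bokeh+refocus", "sharp")]              # S314
          if shape == "DH2":
              return [("bokeh+refocus", "smooth")]             # S316
          return []  # DH3: bokeh not suitable, objects all at the far end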
  • It is noted that the illustrational examples shown in FIG. 2 and FIG. 3 are used for demonstration, and the auto-engine module 160 is not limited to selecting the suitable photography effect according to FIG. 2 or FIG. 3. The auto-engine module 160 can determine the suitable photography effect according to all information gathered by the input source module 140.
  • The depth distribution is utilized to determine subject locations, distances, ranges and spatial relationships. Based on the depth distribution, the subject of the image data is easy to find according to the depth boundary. The depth distribution also reveals the contents/compositions of the image data. The focusing distance from the voice coil motor (VCM) and other related information (e.g., from the image signal processor (ISP)) reveal the environmental conditions. The system information reveals the time, the location and the indoor/outdoor status of the image data. For example, system information from a Global Positioning System (GPS) receiver of the electronic apparatus 100 can indicate whether the image data was taken indoors or outdoors, or near a famous location. The GPS coordinates can hint at which object in the image the user would like to emphasize, according to where the image was taken, such as indoors or outdoors. System information from a gravity sensor, a gyro sensor or a motion sensor of the electronic apparatus 100 can indicate a capturing posture, a shooting angle or a degree of stability while shooting, which is relevant to compensation or effect selection.
  • In some embodiments, the electronic apparatus 100 further includes a display panel 110 (as shown in FIG. 1). The display panel 110 is configured for displaying photos within the image data and also displaying a selectable user interface for selecting the at least one suitable photography effect related to the photos. In some embodiments, the display panel 110 is coupled with the auto-engine module 160 and the post usage module 180, but this disclosure is not limited thereto.
  • Reference is made to FIG. 5, which illustrates a method 500 for providing a user interface on the display panel 110 according to an embodiment of the disclosure. As shown in FIG. 5, step S500 is executed for capturing image data by the camera set. Step S502 is executed for gathering information related to the image data. Step S504 is executed for determining and recommending at least one suitable photography effect from a plurality of candidate photography effects according to the information related to the image data. Aforesaid steps S500 to S504 are explained in detail in the aforesaid embodiments, correspond to steps S200 to S208 in FIG. 2 and steps S300 to S316 in FIG. 3, and are not repeated here.
  • In some embodiments, the method 500 further executes step S508 for displaying at least one selectable user interface for selecting one of the at least one suitable photography effect related to the image data. The selectable user interface shows icons or functional buttons corresponding to different photography effects. The icons or functional buttons of the recommended/suitable photography effects can be highlighted or ranked at a high priority. The icons or functional buttons not in the recommended/suitable list can be grayed out, deactivated or hidden.
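  • The prioritization described above might be rendered as follows, reusing the candidate-effect identifiers from the earlier sketch; the dictionary keys are illustrative UI flags rather than an interface defined by the disclosure.

    ALL_EFFECTS = ["bokeh", "refocus", "macro", "pseudo_3d",
                   "3d_alike", "3d", "flyview_animation"]

    def build_effect_menu(recommended):
        """Rank recommended effects first and highlight them; gray out the rest."""
        ranked = ([e for e in ALL_EFFECTS if e in recommended]
                  + [e for e in ALL_EFFECTS if e not in recommended])
        return [{"effect": e,
                 "highlighted": e in recommended,   # recommended icons highlighted
                 "enabled": e in recommended}       # others grayed out/deactivated
                for e in ranked]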
  • In addition, before a recommended photography effect (from the suitable photography effects) is selected by the user, the method 500 further executes step S506 for automatically applying at least one of the suitable photography effects as a default photography effect to photos shown in a digital album of the electronic apparatus.
  • Furthermore, after the recommended photography effect (from the suitable photography effects) is selected, the method 500 further executes step S510 for automatically applying the most recently selected recommended photography effect to the photos shown in the digital album of the electronic apparatus.
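  • Tying steps S500 through S510 together, a hedged end-to-end sketch (reusing the hypothetical helpers above, with camera, ui and album as assumed driver objects) could look like:

    def method_500(camera, ui, album, vcm, gps, motion_sensor):
        frame, depth_map = camera.capture()                              # S500
        info = gather_capture_info(vcm, gps, motion_sensor, depth_map)   # S502
        hist = [info["depth_histogram"][k] for k in sorted(info["depth_histogram"])]
        recommended = select_effects(info["focus_distance_m"], hist)     # S504
        names = [e[0] if isinstance(e, tuple) else e for e in recommended]
        if names:
            album.apply_effect(names[0])      # S506: default before any selection
        choice = ui.show_menu(build_effect_menu(names))                  # S508
        if choice is not None:
            album.apply_effect(choice)        # S510: latest user selection wins
        return names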
  • Based on the aforesaid embodiments, the disclosure introduces an electronic apparatus and a method for automatically determining corresponding photography effects based on various information, such as a focusing distance (acquired from a position of a voice coil motor), RGB histograms, a depth histogram, sensor information, system information and/or an image disparity. As a result, a user can simply capture photos without manually applying the effects, and appropriate photography effects/configurations will be detected automatically and applied for post usage after the image data are captured.
  • Another embodiment of the disclosure provides a non-transitory computer readable storage medium with a computer program to execute the automatic effect method disclosed in the aforesaid embodiments. The automatic effect method includes steps of: when image data are captured, gathering information related to the image data, the information comprising a focusing distance of the camera set related to the image data; and determining and recommending at least one suitable photography effect from a plurality of candidate photography effects according to the information related to the image data. Details of the automatic effect method are described in the aforesaid embodiments as shown in FIG. 2 and FIG. 3, and are not repeated here.
  • In this document, the term “coupled” may also be termed as “electrically coupled”, and the term “connected” may be termed as “electrically connected”. “Coupled” and “connected” may also be used to indicate that two or more elements cooperate or interact with each other. It will be understood that, although the terms “first,” “second,” etc., may be used herein to describe various elements, these elements should not be limited by these terms. These terms are used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the embodiments. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
  • The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present disclosure, and that they may make various changes, substitutions, and alterations herein without departing from the spirit and scope of the present disclosure.

Claims (31)

1. An electronic apparatus, comprising:
a camera set, configured for capturing image data;
one or more processors; and
a non-transitory computer-readable medium having computer-executable instructions to be executed by the one or more processors for performing a method, the method comprising:
gathering information related to the image data, wherein the information related to the image data comprises a distance between a target object of a scene and the camera set; and
determining at least one suitable photography effect from a plurality of candidate photography effects according to the distance between the target object of the scene and the camera set.
2. The electronic apparatus of claim 1, wherein the information related to the image data comprises image characteristic information of the image data, wherein the method further comprises:
determining, according to the image characteristic information, whether the captured image data is valid for applying any of the candidate photography effects.
3. The electronic apparatus of claim 2, wherein the image characteristic information of the image data comprises exchangeable image file format (EXIF) data extracted from the image data, the exchangeable image file format (EXIF) data comprises dual image information corresponding to a pair of photos of the image data, time stamps corresponding to the pair of photos and focusing distances of the pair of photos, and the step of determining whether the captured image data is valid comprises:
checking the dual image information, the time stamps or the focusing distances so as to determine whether the captured image data is valid.
4. The electronic apparatus of claim 1, wherein the camera set comprises at least one voice coil motor, and the distance between the target object of the scene and the camera set is acquired by the voice coil motor.
5. The electronic apparatus of claim 1, wherein the camera set comprises dual camera units or a plurality of camera units.
6. The electronic apparatus of claim 1, wherein the candidate photography effects comprise at least one effect selected from the group including bokeh effect, refocus effect, macro effect, pseudo-3D effect, 3D-alike effect, 3D effect and a flyview animation effect.
7. The electronic apparatus of claim 6, wherein, if the distance between the target object of the scene and the camera set is shorter than a predefined reference, the suitable photography effect is substantially selected from the group consisting of macro effect, pseudo-3D effect, 3D-alike effect, 3D effect and a flyview animation effect.
8. The electronic apparatus of claim 6, wherein, if the distance between the target object of the scene and the camera set is longer than a predefined reference, the suitable photography effect is substantially selected from the group consisting of bokeh effect and refocus effect.
9. The electronic apparatus of claim 1, wherein the method further comprises:
analyzing a depth distribution of the image data relative to the scene;
wherein the information related to the image data further comprises the depth distribution and the step of determining the at least one suitable photography effect comprises:
determining the suitable photography effect or a parameter of the suitable photography effect further according to the depth distribution.
10. The electronic apparatus of claim 1, further comprising:
a display panel, configured for displaying the image data and a selectable user interface, the selectable user interface being configured for recommending a user to select from the at least one suitable photography effect related to the image data;
wherein, after one of the suitable photography effects is selected on the user interface, the selected one of the suitable photography effects is applied to the image data.
11. A method, suitable for an electronic apparatus with a camera set, the method comprising:
capturing image data by the camera set;
gathering information related to the image data, the information comprising a distance between a target object of a scene and the camera set; and
determining at least one suitable photography effect from a plurality of candidate photography effects according to the distance between the target object of the scene and the camera set.
12. The method of claim 11, further comprising:
providing a selectable user interface, the selectable user interface being configured for recommending a user to select from the at least one suitable photography effect related to the image data.
13. The method of claim 12, further comprising:
before one from the at least one suitable photography effect is selected by the user, automatically applying one of the suitable photography effects as a default photography effect to the image data shown in a digital album of the electronic apparatus.
14. The method of claim 12, further comprising:
after one from the at least one suitable photography effect is selected by the user, automatically applying the selected photography effect to the image data shown in a digital album of the electronic apparatus.
15. The method of claim 11, wherein the candidate photography effects comprise at least one effect selected from the group including bokeh effect, refocus effect, macro effect, pseudo-3D effect, 3D-alike effect, 3D effect and a flyview animation effect.
16. The method of claim 15, wherein, if the distance between the target object of the scene and the camera set is shorter than a predefined reference, the suitable photography effect is substantially selected from the group consisting of macro effect, pseudo-3D effect, 3D-alike effect, 3D effect and a flyview animation effect.
17. The method of claim 15, wherein, if the distance between the target object of the scene and the camera set is longer than a predefined reference, the suitable photography effect is substantially selected from the group consisting of bokeh effect and refocus effect.
18. The method of claim 11, further comprising:
analyzing a depth distribution of the image data, wherein the information related to the image data further comprises the depth distribution, and the suitable photography effect is determined further according to the depth distribution.
19. The method of claim 11, wherein the camera set comprises dual camera units or a plurality of camera units.
20. The method of claim 11, wherein the information related to the image data comprises image characteristic information of the image data, the method further comprises:
determining, according to the image characteristic information, whether the captured image data is valid for applying any of the candidate photography effects.
21. The method of claim 20, wherein the image characteristic information of the image data comprises exchangeable image file format (EXIF) data extracted from the image data, the exchangeable image file format (EXIF) data comprises dual image information corresponding to a pair of photos of the image data, time stamps corresponding to the pair of photos and focusing distances of the pair of photos, and the method further comprises:
checking the dual image information, the time stamps or the focusing distances so as to determine whether the captured image data is valid.
22. The method of claim 11, wherein the camera set comprises at least one voice coil motor, and the distance between the target object of the scene and the camera set is acquired by the voice coil motor.
23-30. (canceled)
31. A method, suitable for an electronic apparatus with a camera set, the method comprising:
capturing image data by the camera set;
gathering information related to the image data, the information comprising a distance between a target object of a scene and the camera set;
analyzing a depth distribution of the image data, wherein the information related to the image data further comprises the depth distribution; and
determining at least one suitable photography effect from a plurality of candidate photography effects according to the distance between the target object of the scene and the camera set and the depth distribution, wherein the at least one suitable photography effect is determined by comparing the distance between the target object of the scene and the camera set with a predefined reference and by comparing the depth distribution with a plurality of predetermined depth distributions.
32. The method of claim 31, further comprising:
providing a selectable user interface, the selectable user interface being configured for recommending a user to select from the at least one suitable photography effect related to the image data.
33. The method of claim 32, further comprising:
before one from the at least one suitable photography effect is selected by the user, automatically applying one of the suitable photography effects as a default photography effect to the image data shown in a digital album of the electronic apparatus.
34. The method of claim 32, further comprising:
after one from the at least one suitable photography effect is selected by the user, automatically applying the selected photography effect to the image data shown in a digital album of the electronic apparatus.
35. The method of claim 31, wherein the candidate photography effects comprise at least one effect selected from the group including bokeh effect, refocus effect, macro effect, pseudo-3D effect, 3D-alike effect, 3D effect and a flyview animation effect.
36. The method of claim 35, wherein, if the distance between the target object of the scene and the camera set is shorter than the predefined reference, the suitable photography effect is substantially selected from the group consisting of macro effect, pseudo-3D effect, 3D-alike effect, 3D effect and a flyview animation effect.
37. The method of claim 35, wherein, if the distance between the target object of the scene and the camera set is longer than the predefined reference, the suitable photography effect is substantially selected from the group consisting of bokeh effect and refocus effect.
38. The method of claim 31, wherein the camera set comprises dual camera units or a plurality of camera units.
US14/272,513 2013-10-28 2014-05-08 Automatic effect method for photography and electronic apparatus Abandoned US20150116529A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/272,513 US20150116529A1 (en) 2013-10-28 2014-05-08 Automatic effect method for photography and electronic apparatus
DE201410010152 DE102014010152A1 (en) 2013-10-28 2014-07-09 Automatic effect method for photography and electronic device
TW103124395A TWI549503B (en) 2013-10-28 2014-07-16 Electronic apparatus, automatic effect method and non-transitory computer readable storage medium
CN201410362346.6A CN104580878B (en) 2013-10-28 2014-07-28 Electronic device and the method for automatically determining image effect

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361896136P 2013-10-28 2013-10-28
US201461923780P 2014-01-06 2014-01-06
US14/272,513 US20150116529A1 (en) 2013-10-28 2014-05-08 Automatic effect method for photography and electronic apparatus

Publications (1)

Publication Number Publication Date
US20150116529A1 true US20150116529A1 (en) 2015-04-30

Family

ID=52811781

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/272,513 Abandoned US20150116529A1 (en) 2013-10-28 2014-05-08 Automatic effect method for photography and electronic apparatus

Country Status (4)

Country Link
US (1) US20150116529A1 (en)
CN (1) CN104580878B (en)
DE (1) DE102014010152A1 (en)
TW (1) TWI549503B (en)

US11397517B2 (en) 2016-12-09 2022-07-26 Snap Inc. Customized media overlays
US10203855B2 (en) 2016-12-09 2019-02-12 Snap Inc. Customized user-controlled media overlays
US10754525B1 (en) 2016-12-09 2020-08-25 Snap Inc. Customized media overlays
US20180182141A1 (en) * 2016-12-22 2018-06-28 Facebook, Inc. Dynamic mask application
US10636175B2 (en) * 2016-12-22 2020-04-28 Facebook, Inc. Dynamic mask application
US11443460B2 (en) 2016-12-22 2022-09-13 Meta Platforms, Inc. Dynamic mask application
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US11870743B1 (en) 2017-01-23 2024-01-09 Snap Inc. Customized digital avatar accessories
US10915911B2 (en) 2017-02-03 2021-02-09 Snap Inc. System to determine a price-schedule to distribute media content
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US11250075B1 (en) 2017-02-17 2022-02-15 Snap Inc. Searching social media content
US11861795B1 (en) 2017-02-17 2024-01-02 Snap Inc. Augmented reality anamorphosis system
US11720640B2 (en) 2017-02-17 2023-08-08 Snap Inc. Searching social media content
US10614828B1 (en) 2017-02-20 2020-04-07 Snap Inc. Augmented reality speech balloon system
US11748579B2 (en) 2017-02-20 2023-09-05 Snap Inc. Augmented reality speech balloon system
US11189299B1 (en) 2017-02-20 2021-11-30 Snap Inc. Augmented reality speech balloon system
US11670057B2 (en) 2017-03-06 2023-06-06 Snap Inc. Virtual vision system
US11037372B2 (en) 2017-03-06 2021-06-15 Snap Inc. Virtual vision system
US10887269B1 (en) 2017-03-09 2021-01-05 Snap Inc. Restricted group content collection
US10523625B1 (en) 2017-03-09 2019-12-31 Snap Inc. Restricted group content collection
US11258749B2 (en) 2017-03-09 2022-02-22 Snap Inc. Restricted group content collection
US11349796B2 (en) 2017-03-27 2022-05-31 Snap Inc. Generating a stitched data stream
US11297399B1 (en) 2017-03-27 2022-04-05 Snap Inc. Generating a stitched data stream
US11558678B2 (en) 2017-03-27 2023-01-17 Snap Inc. Generating a stitched data stream
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US11195018B1 (en) 2017-04-20 2021-12-07 Snap Inc. Augmented reality typography personalization system
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US11474663B2 (en) 2017-04-27 2022-10-18 Snap Inc. Location-based search mechanism in a graphical user interface
US11556221B2 (en) 2017-04-27 2023-01-17 Snap Inc. Friend location sharing mechanism for social media platforms
US11392264B1 (en) 2017-04-27 2022-07-19 Snap Inc. Map-based graphical user interface for multi-type social media galleries
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US11418906B2 (en) 2017-04-27 2022-08-16 Snap Inc. Selective location-based identity communication
US11451956B1 (en) 2017-04-27 2022-09-20 Snap Inc. Location privacy management on map-based social media platforms
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US11385763B2 (en) 2017-04-27 2022-07-12 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US11409407B2 (en) 2017-04-27 2022-08-09 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11782574B2 (en) 2017-04-27 2023-10-10 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11232040B1 (en) 2017-04-28 2022-01-25 Snap Inc. Precaching unlockable data elements
EP3488603B1 (en) * 2017-05-24 2021-07-28 SZ DJI Technology Co., Ltd. Methods and systems for processing an image
CN110663246A (en) * 2017-05-24 2020-01-07 SZ DJI Technology Co., Ltd. Method and system for processing images
US11675831B2 (en) 2017-05-31 2023-06-13 Snap Inc. Geolocation based playlists
EP3628121A4 (en) * 2017-08-22 2020-05-13 Samsung Electronics Co., Ltd. Electronic device for storing depth information in connection with image depending on properties of depth information obtained using image and control method thereof
US11025814B2 (en) 2017-08-22 2021-06-01 Samsung Electronics Co., Ltd Electronic device for storing depth information in connection with image depending on properties of depth information obtained using image and control method thereof
CN111052727A (en) * 2017-08-22 2020-04-21 Samsung Electronics Co., Ltd. Electronic device for storing depth information in association with image according to attribute of depth information obtained using image and control method thereof
US11475254B1 (en) 2017-09-08 2022-10-18 Snap Inc. Multimodal entity identification
US11721080B2 (en) 2017-09-15 2023-08-08 Snap Inc. Augmented reality system
US11335067B2 (en) 2017-09-15 2022-05-17 Snap Inc. Augmented reality system
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US11006242B1 (en) 2017-10-09 2021-05-11 Snap Inc. Context sensitive presentation of content
US11617056B2 (en) 2017-10-09 2023-03-28 Snap Inc. Context sensitive presentation of content
US10499191B1 (en) 2017-10-09 2019-12-03 Snap Inc. Context sensitive presentation of content
US11570379B2 (en) 2017-10-19 2023-01-31 Paypal, Inc. Digital image filtering and post-capture processing using user specific data
US11019279B2 (en) * 2017-10-19 2021-05-25 Paypal, Inc. Digital image filtering and post-capture processing using user specific data
US11030787B2 (en) 2017-10-30 2021-06-08 Snap Inc. Mobile-based cartographic control of display content
US11670025B2 (en) 2017-10-30 2023-06-06 Snap Inc. Mobile-based cartographic control of display content
US20190166314A1 (en) * 2017-11-30 2019-05-30 International Business Machines Corporation Ortho-selfie distortion correction using multiple sources
US10721419B2 (en) * 2017-11-30 2020-07-21 International Business Machines Corporation Ortho-selfie distortion correction using multiple image sensors to synthesize a virtual image
US11558327B2 (en) 2017-12-01 2023-01-17 Snap Inc. Dynamic media overlay with smart widget
US11265273B1 (en) 2017-12-01 2022-03-01 Snap Inc. Dynamic media overlay with smart widget
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US11687720B2 (en) 2017-12-22 2023-06-27 Snap Inc. Named entity recognition visual context and caption data
US10678818B2 (en) 2018-01-03 2020-06-09 Snap Inc. Tag distribution visualization system
US11487794B2 (en) 2018-01-03 2022-11-01 Snap Inc. Tag distribution visualization system
US11507614B1 (en) 2018-02-13 2022-11-22 Snap Inc. Icon based tagging
US11841896B2 (en) 2018-02-13 2023-12-12 Snap Inc. Icon based tagging
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US11523159B2 (en) 2018-02-28 2022-12-06 Snap Inc. Generating media content items based on location information
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US11722837B2 (en) 2018-03-06 2023-08-08 Snap Inc. Geo-fence selection system
US11044574B2 (en) 2018-03-06 2021-06-22 Snap Inc. Geo-fence selection system
US10327096B1 (en) 2018-03-06 2019-06-18 Snap Inc. Geo-fence selection system
US10524088B2 (en) 2018-03-06 2019-12-31 Snap Inc. Geo-fence selection system
US11570572B2 (en) 2018-03-06 2023-01-31 Snap Inc. Geo-fence selection system
US10933311B2 (en) 2018-03-14 2021-03-02 Snap Inc. Generating collectible items based on location information
US11491393B2 (en) 2018-03-14 2022-11-08 Snap Inc. Generating collectible items based on location information
US11163941B1 (en) 2018-03-30 2021-11-02 Snap Inc. Annotating a collection of media content items
US10924886B2 (en) 2018-04-18 2021-02-16 Snap Inc. Visitation tracking system
US10448199B1 (en) 2018-04-18 2019-10-15 Snap Inc. Visitation tracking system
US10779114B2 (en) 2018-04-18 2020-09-15 Snap Inc. Visitation tracking system
US10219111B1 (en) 2018-04-18 2019-02-26 Snap Inc. Visitation tracking system
US11297463B2 (en) 2018-04-18 2022-04-05 Snap Inc. Visitation tracking system
US10681491B1 (en) 2018-04-18 2020-06-09 Snap Inc. Visitation tracking system
US11683657B2 (en) 2018-04-18 2023-06-20 Snap Inc. Visitation tracking system
US11233953B2 (en) 2018-05-11 2022-01-25 Samsung Electronics Co., Ltd. Image editing method and electronic device supporting same
WO2019216630A1 (en) * 2018-05-11 2019-11-14 Samsung Electronics Co., Ltd. Image editing method and electronic device supporting same
US11860888B2 (en) 2018-05-22 2024-01-02 Snap Inc. Event detection system
CN110581950A (en) * 2018-06-11 2019-12-17 Sony Corporation Camera, system and method for selecting camera settings
US11367234B2 (en) 2018-07-24 2022-06-21 Snap Inc. Conditional modification of augmented reality object
US10679393B2 (en) 2018-07-24 2020-06-09 Snap Inc. Conditional modification of augmented reality object
US11670026B2 (en) 2018-07-24 2023-06-06 Snap Inc. Conditional modification of augmented reality object
US10789749B2 (en) 2018-07-24 2020-09-29 Snap Inc. Conditional modification of augmented reality object
US10943381B2 (en) 2018-07-24 2021-03-09 Snap Inc. Conditional modification of augmented reality object
US11676319B2 (en) 2018-08-31 2023-06-13 Snap Inc. Augmented reality anthropomorphization system
US11450050B2 (en) 2018-08-31 2022-09-20 Snap Inc. Augmented reality anthropomorphization system
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11455082B2 (en) 2018-09-28 2022-09-27 Snap Inc. Collaborative achievement interface
US11704005B2 (en) 2018-09-28 2023-07-18 Snap Inc. Collaborative achievement interface
US11799811B2 (en) 2018-10-31 2023-10-24 Snap Inc. Messaging and gaming applications communication platform
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11698722B2 (en) 2018-11-30 2023-07-11 Snap Inc. Generating customized avatars based on location information
US11812335B2 (en) 2018-11-30 2023-11-07 Snap Inc. Position service to determine relative position to map features
US11558709B2 (en) 2018-11-30 2023-01-17 Snap Inc. Position service to determine relative position to map features
WO2020145744A1 (en) * 2019-01-11 2020-07-16 LG Electronics Inc. Camera device and electronic device including same
US11877211B2 (en) 2019-01-14 2024-01-16 Snap Inc. Destination sharing in location sharing system
US11751015B2 (en) 2019-01-16 2023-09-05 Snap Inc. Location-based context information sharing in a messaging system
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11693887B2 (en) 2019-01-30 2023-07-04 Snap Inc. Adaptive spatial density based clustering
US11809624B2 (en) 2019-02-13 2023-11-07 Snap Inc. Sleep detection in a location sharing system
US11500525B2 (en) 2019-02-25 2022-11-15 Snap Inc. Custom media overlay system
US11574431B2 (en) 2019-02-26 2023-02-07 Snap Inc. Avatar based on weather
US11301117B2 (en) 2019-03-08 2022-04-12 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11740760B2 (en) 2019-03-28 2023-08-29 Snap Inc. Generating personalized map interface with enhanced icons
US11249614B2 (en) 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US11361493B2 (en) 2019-04-01 2022-06-14 Snap Inc. Semantic texture mapping system
US11206615B2 (en) 2019-05-30 2021-12-21 Snap Inc. Wearable device location systems
US11606755B2 (en) 2019-05-30 2023-03-14 Snap Inc. Wearable device location systems architecture
US11785549B2 (en) 2019-05-30 2023-10-10 Snap Inc. Wearable device location systems
US11917495B2 (en) 2019-06-07 2024-02-27 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11601783B2 (en) 2019-06-07 2023-03-07 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11714535B2 (en) 2019-07-11 2023-08-01 Snap Inc. Edge gesture interface with smart interactions
US11821742B2 (en) 2019-09-26 2023-11-21 Snap Inc. Travel based notifications
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11729343B2 (en) 2019-12-30 2023-08-15 Snap Inc. Including video feed in message thread
US11893208B2 (en) 2019-12-31 2024-02-06 Snap Inc. Combined map icon with action indicator
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
US11228551B1 (en) 2020-02-12 2022-01-18 Snap Inc. Multiple gateway message exchange
US11888803B2 (en) 2020-02-12 2024-01-30 Snap Inc. Multiple gateway message exchange
US11765117B2 (en) 2020-03-05 2023-09-19 Snap Inc. Storing data based on device location
US11516167B2 (en) 2020-03-05 2022-11-29 Snap Inc. Storing data based on device location
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11776256B2 (en) 2020-03-27 2023-10-03 Snap Inc. Shared augmented reality system
US11430091B2 (en) 2020-03-27 2022-08-30 Snap Inc. Location mapping for large scale augmented-reality
US11915400B2 (en) 2020-03-27 2024-02-27 Snap Inc. Location mapping for large scale augmented-reality
US11314776B2 (en) 2020-06-15 2022-04-26 Snap Inc. Location sharing using friend list versions
US11483267B2 (en) 2020-06-15 2022-10-25 Snap Inc. Location sharing using different rate-limited links
US11290851B2 (en) 2020-06-15 2022-03-29 Snap Inc. Location sharing using offline and online objects
US11503432B2 (en) 2020-06-15 2022-11-15 Snap Inc. Scalable real-time location sharing framework
US11676378B2 (en) 2020-06-29 2023-06-13 Snap Inc. Providing travel-based augmented reality content with a captured image
US11902902B2 (en) 2021-03-29 2024-02-13 Snap Inc. Scheduling requests for location data
US11601888B2 (en) 2021-03-29 2023-03-07 Snap Inc. Determining location using multi-source geolocation data
US11606756B2 (en) 2021-03-29 2023-03-14 Snap Inc. Scheduling requests for location data
US11645324B2 (en) 2021-03-31 2023-05-09 Snap Inc. Location-based timeline media content system
US11829834B2 (en) 2021-10-29 2023-11-28 Snap Inc. Extended QR code

Also Published As

Publication number Publication date
TW201517620A (en) 2015-05-01
CN104580878A (en) 2015-04-29
TWI549503B (en) 2016-09-11
CN104580878B (en) 2018-06-26
DE102014010152A1 (en) 2015-04-30

Similar Documents

Publication Title
US20150116529A1 (en) Automatic effect method for photography and electronic apparatus
US11210799B2 (en) Estimating depth using a single camera
US11418714B2 (en) Image processing device, method of processing image, image processing program, and imaging device
US9544574B2 (en) Selecting camera pairs for stereoscopic imaging
KR102565513B1 (en) Method and apparatus for multiple technology depth map acquisition and fusion
US9591237B2 (en) Automated generation of panning shots
US8508622B1 (en) Automatic real-time composition feedback for still and video cameras
US8497920B2 (en) Method, apparatus, and computer program product for presenting burst images
US10139218B2 (en) Image processing apparatus and image processing method
WO2015180684A1 (en) Mobile terminal-based shooting simulation teaching method and system, and storage medium
KR101930460B1 (en) Photographing apparatusand method for controlling thereof
WO2014165472A1 (en) Camera obstruction detection
CN103188434B (en) Method and device of image collection
CN103222259A (en) High dynamic range transition
US8558935B2 (en) Scene information displaying method and apparatus and digital photographing apparatus using the scene information displaying method and apparatus
US9953220B2 (en) Cutout object merge
US20160104291A1 (en) Image refocusing
KR102272310B1 (en) Method of processing images, Computer readable storage medium of recording the method and an electronic apparatus
CN110581950B (en) Camera, system and method for selecting camera settings
CN112017137A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
US20120229678A1 (en) Image reproducing control apparatus
KR102146856B1 (en) Method of displaying a photographing mode using lens characteristics, Computer readable storage medium of recording the method and a digital photographing apparatus.
CN104735353A (en) Method and device for taking panoramic photo
CN104793910A (en) Method and electronic equipment for processing information
WO2017071560A1 (en) Picture processing method and device

Legal Events

Date Code Title Description
AS Assignment

Owner name: HTC CORPORATION, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WU, JING-LUNG;CHUEH, HSIN-TI;TSENG, FU-CHANG;AND OTHERS;SIGNING DATES FROM 20140515 TO 20140516;REEL/FRAME:033301/0358

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION