If you are over Instagram and want to free yourself from the social media giant, here's how you can delete your account.
The post How to delete your Instagram account appeared first on Popular Photography.
Instagram has had its fair share of controversy lately, so deleting your account may seem more appealing than ever. Maybe you’ve decided that your privacy isn’t protected or respected enough by the social media company. Or perhaps you’re worn down by the constant barrage of information or how easy it is to compare your life to others. Any number of variables could have you wondering how to delete your Instagram account.
No matter the reason, if you’ve decided you’ve had it with Instagram, you have a few options. You can simply delete the app and do nothing more, temporarily deactivate or disable your account, or delete your account, which is permanent.
If you want just a casual, brief breakup from Instagram, deleting the app is the easiest way to do so. Deleting the app will remove any temptation to open it up and start scrolling again, so this is a great option if you need a social media detox but want to come back eventually.
However, this will still leave you the option to visit Instagram from your browser (if the lackluster user interface isn’t enough to discourage you from doing so).
Deleting the Instagram app does not change how others see or interact with your account. So, if you want something that will hide your account, but you aren’t quite ready to fully end things forever, deactivating or disabling your Instagram account may be the way to go. When your account is disabled, everything associated with it, including your profile, likes, comments, and photos, will be hidden from others. Disabling your account is a temporary measure, and you can reactivate it whenever you want by logging back in.
Unfortunately, Instagram does not allow you to disable your account from the app. The only way to do so is from a computer or in a web browser on your phone. Here’s how to disable your Instagram account:
If you want to get rid of your account for security or privacy reasons, disabling it won’t be enough, as all of your information will still be there. Or maybe you have other reasons for wanting a more permanent solution. Either way, you’ll want to delete your account, not just disable it.
While you will have 30 days to change your mind, deleting your account is permanent. All of your photos, followers, likes, and comments will be gone. It is a way to remove your Instagram account from the world of social media entirely. Because of this, you may want to download a copy of your data. Instagram provides instructions on how to do so. Unfortunately, Instagram doesn’t make it all that easy to find where to go to delete an account. But, once you are ready, follow the steps (and link) below.
For those who don’t have access to a computer, being able to delete an account on a phone would be ideal. Unfortunately, despite being almost exclusively an app-based platform, Instagram does not allow you to close an Instagram account via the app. Instead, just like deactivating your account, you’ll have to use your mobile web browser.
Once you have deleted your Instagram account, you have 30 days to change your mind and reinstate your account. Your account will be invisible to others during that time, but you can log in and reactivate your account within that 30 days if you wish.
Articles may contain affiliate links which enable us to share in the revenue of any purchases made.
Whether you're shooting Android or iPhone, here's how to get the most out of your device's built-in camera app.
The post How to unlock your smartphone camera’s best hidden features appeared first on Popular Photography.
What could be more fundamental to photography today than our smartphone cameras? They’re ever-present, ready in moments, and the technology behind them makes it easy to capture great photos in most situations. And yet, I regularly encounter people who are unaware of many of the core functions of the built-in camera app.
Smartphone camera fundamentals extend beyond just “push the big button.” Some tools help you set up the shot, and some give you more control over the exposure. A few are just plain convenient or cool. However, these features aren’t always easy to find. That’s where we come in.
But first, for these examples, I’m using the two phones I have at hand: an iPhone 13 Pro running iOS 16 and a Google Pixel 6 Pro running Android 13. I’m also focusing just on the built-in camera apps; for even more manual control, you can find third-party apps in the app stores. Many camera features overlap between iOS and Android, though some may not be available on older models or may be accessible in a different way. If a feature described here doesn’t match what you see on your phone, break out the manual—I mean, search Google—and see if it’s available for yours.
Most people perform the usual dance of unlocking the phone, finding the camera app, and tapping to launch it. By that time, the moment you were trying to capture might be gone. There are faster ways.
Related: Composition in the age of AI – Who’s really framing the shot?
On the iPhone’s lock screen, swipe right-to-left to jump straight to the camera app without unlocking the phone at all. You can also press the camera icon on the lock screen. On the Pixel, double-press the power button from any screen.
When the phone is unlocked, a few more options are available. On both phones, press and hold the camera app icon to bring up a menu of shooting modes, such as opening the app with the front-facing selfie camera active.
I also like the ability to double-tap the back of the phone to launch the camera. On the iPhone, go to Settings > Accessibility > Touch > Back Tap and choose Camera for the Double Tap (or Triple Tap) option. In Android, go to Settings > System > Gestures > Quick Tap > Open app and choose Camera.
Related: Outsmart your iPhone camera’s overzealous AI
If you miss the tactile feedback of pressing a physical shutter button, or if hitting the software button introduces too much shake, press a volume button instead.
On both phones, pressing either volume button triggers the shutter. Holding a button starts recording video, just as holding your finger on the virtual shutter button does.
On the iPhone, you can also set the volume up button to fire off multiple shots in burst mode: go to Settings > Camera > Use Volume Up for Burst.
The camera apps do a good job of determining the proper exposure for any given scene—if you forget that “proper” is a loaded term. You do have more control, though, even if the interfaces don’t make it obvious.
On the iPhone, tap anywhere in the preview to set the focus and meter the exposure level based on that point. Even better (and this is a feature I find that many people don’t know about), touch and hold a spot to lock the focus and exposure (an “AE/AF LOCK” badge appears). You can then move the phone to adjust the composition and not risk the app automatically resetting them.
Once the focus and exposure are set or locked, lift your finger from the screen and then drag the sun icon that appears to the right of the target box to manually increase or decrease the exposure. A single tap anywhere else resets the focus and exposure back to automatic.
On the Pixel, tap a point to set the focus and exposure. That spot becomes a target, which stays locked even as you move the phone to recompose the scene. Tapping also displays sliders you can use to adjust white balance, exposure, and contrast. Tap the point again to remove the lock, or tap elsewhere to focus on another area.
We think of “the camera” on our phones, but really, on most modern phones, there are multiple cameras, each with its own image sensor behind the array of lenses. So when you’re tapping the “1x” or “3x” button to zoom in or out, you’re switching between cameras.
Whenever possible, stick to those preset zoom levels. The 1x level uses the main camera (what Apple calls the “wide” camera), the 3x level uses the telephoto camera, and so on. Those are optical values, which means you’ll get a cleaner image as the sensor records the light directly.
But wait, what about the two-finger pinch gesture to zoom in or out? You can also drag left or right on the zoom selection buttons to reveal a circular control (iPhone) or slider (Android), letting you compose your scene without needing to move, or even zoom way in to 15x or 20x.
It’s so convenient, but try to avoid it if possible. All those in-between values are calculated digitally: the software is interpolating what the scene would look like at that zoom level by artificially enlarging pixels. Digital zoom technology has improved dramatically over the years, but optical zoom is still the best option.
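Under the hood, digital zoom is essentially a center crop scaled back up by interpolation. Here's a minimal, hypothetical sketch using nearest-neighbor resampling (real camera apps use far more sophisticated, often ML-assisted, algorithms):

```python
def digital_zoom(img, factor):
    """Simulate digital zoom on a grayscale image (a list of rows).

    Crops the center 1/factor of the frame, then enlarges the crop
    back to the original size by repeating pixels (nearest-neighbor).
    No new detail is created; existing pixels are simply enlarged.
    """
    h, w = len(img), len(img[0])
    ch, cw = int(h / factor), int(w / factor)   # cropped size
    top, left = (h - ch) // 2, (w - cw) // 2    # center the crop
    crop = [row[left:left + cw] for row in img[top:top + ch]]
    # Upscale: each output pixel maps back to its nearest source pixel.
    return [[crop[y * ch // h][x * cw // w] for x in range(w)]
            for y in range(h)]

# A 4x4 "image" zoomed 2x: the center 2x2 block fills the frame,
# with every pixel doubled in each direction.
frame = [[ 0,  1,  2,  3],
         [ 4,  5,  6,  7],
         [ 8,  9, 10, 11],
         [12, 13, 14, 15]]
zoomed = digital_zoom(frame, 2)
```

Optical zoom, by contrast, changes which light actually reaches the sensor, which is why the preset lens steps produce cleaner files.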
Speaking of switching, the camera apps feature many different shooting modes, such as Photo, Video, and Portrait. Instead of tapping or trying to drag the row of mode names, on both iOS and Android, simply swipe left or right in the middle of the screen to switch modes.
Whether you subscribe to the “rule of thirds” or just want some help keeping your horizons level, the built-in grid features are handy.
In iOS, go to Settings > Camera > Grid and turn the option on. In Android, you can choose from three types of grids by going to the settings in the camera app, tapping More Settings, and choosing a Grid Type (such as 3 x 3).
The grid on the iPhone, and a related setting called Framing Hints on the Pixel, also enable a horizontal level. When you’re holding the phone parallel to the ground or a table, a + icon appears in the middle of the screen on both models. As you tilt, the phone’s accelerometer displays a second + icon to show that you’re off level. Maneuver the phone so the two icons line up and the camera is level.
Both camera systems are great about providing more light in dark situations, whether that’s turning on the built-in flash or activating Night mode (iOS) or Night Sight (Android). The interfaces for controlling those are pretty minimal, though.
On the iPhone, tap the flash icon (the lightning bolt) to toggle between Off and Auto. For more options, tap the caret (^) icon, which replaces the camera modes beneath the preview with buttons for more features. Tap the Flash button to choose between Auto, On, and Off.
On the Pixel, tap the Settings button in the camera app and, under More Light, tap the Flash icon (another lightning bolt).
The Pixel includes its Night Sight mode in the More Light category. When it’s enabled, Night Sight automatically activates in dark situations—you’ll see a crescent moon icon on the shutter button. You can temporarily deactivate this by tapping the Night Sight Auto button that appears to the right of the camera modes.
The iPhone’s Night mode is controlled by a separate button, which looks like a crescent moon with vertical stripes indicating the dark side of the moon. Tap it to turn Night mode on or off. Or, tap the caret (^) icon and then tap the Night mode button to reveal a sliding control that lets you choose an exposure time beyond just Auto (up to 30 seconds in a dark environment when the phone is stabilized, such as on a tripod).
As with every camera—smartphone or traditional—there are plenty of features to help you get the best shot. Be sure to explore the app settings and the other buttons (such as setting self-timers or changing the default aspect ratio) so that when the time comes, you know exactly which smartphone camera feature to turn to.
Say goodbye to single-slide stories.
The post How to post multiple photos to your Instagram story appeared first on Popular Photography.
This story originally appeared on Popular Science.
If you’ve been uploading, editing, and posting one image at a time to your Instagram story, you’re not alone. That said, you don’t have to struggle like you’re sweating over the social media equivalent of a printing press—you can publish to your story in bulk.
The ability to post multiple photos to your Instagram story at once has been around since at least 2018, but we won’t judge you for learning about it today. It’s impossible to keep up with every tiny change Meta and the other tech companies add to their constantly changing apps, especially when you’ve fallen into a routine that works for you.
Experienced posters will know the process starts as any Instagram story would: by tapping the plus icon in the top right of the app’s main page, the new story icon (a blue plus sign on top of your profile picture) at the far left end of your friends’ stories, or the identical add to story icon inside your existing story. If this is your first foray into stories, well, congratulations on starting off better equipped than we did.
Once you’re on the story creation page, tap Select in the top right. Then choose up to 10 images by touching the ones you want, in the order you want them to appear in your story. Instagram will help you out by marking each selected image with a numbered blue circle, so you’ll know exactly how they’ll be arranged. You can’t move them around later, so make sure you get the sequence right before you continue.
Related: How to disable your Instagram profile from being embedded on a website
With your images in order, tap the arrow icon in the bottom right. The first image will appear on screen, and you can edit it as you would any other story post. But before you do anything else, take a moment to double-check that you’ve grabbed all the pics you want to share and that they’re arranged properly. If something’s wrong, you can hit the arrow icon in the top left to go back to the selection screen, but you’ll lose all of your edits.
When you’re ready, you can edit the other slides by tapping their thumbnails in the carousel at the bottom of the screen.
Finally, hit the arrow icon in the bottom right to open the sharing menu—you can choose to post the pics to Your story, Close friends, or send them as a direct message. Make your selection and hit Share. Instagram may also ask if you’d like to add them to your Story highlights or send them as a direct Instagram message to anyone, but you can tap Done to ignore it.
Now you’ll never have to slog through a slew of concert, night-out, or fancy dinner pics again, and that’s pretty sweet.
Overwhelmed by a wave of annoying 'Suggested Posts' in your Instagram feed? Here's how to silence them.
The post How to disable ‘Suggested Posts’ on Instagram appeared first on Popular Photography.
When Instagram unveiled a way to bring back the chronological feed, the announcement also seemed to hint that Meta might soon be adding a lot more suggested content to your algorithmic feed. Well, if you’ve been on Instagram in the past few weeks, you’ve probably noticed that that time has come. My regular feed has been overwhelmed with algorithmically recommended content, aka, “Suggested Posts.” If yours has too, the good news is there’s a way to stop it.
A lot of the time, the same suggested accounts will keep popping up in your feed. If there are one or two accounts you just don’t want to see again, getting rid of them is easy.
Tap the little X above the next suggested post from the account you don’t like.
Then tap Don’t Suggest Posts from [Account Name].
And if it’s the kind of stuff they post that you don’t like, you can tap Don’t Suggest Posts Related to [Account Name].
If it’s suggested posts in general that are annoying you (as they are for me), you can snooze them for 30 days. After that, they come back with a vengeance, but you can just snooze them again.
Tap the little X above the next suggested post you see.
Then tap Snooze All Suggested Posts in Feed for 30 Days.
And just like that, you won’t see another suggested post for a month.
While Instagram wants to be the most “engaging” experience possible (to keep you from spending any time on competitors’ platforms, like TikTok), you can easily improve things in a few ways.
First, we recommend using the recently released Favorites feature. Not only does it give you a dedicated chronological feed for the 50 accounts you care most about, but it also makes them appear at the top of the regular feed. Another tip? Unfollow, unfollow, unfollow. If an account is posting too much or spamming your feed, don’t ignore it, unfollow it.
We also find it helps to set a daily time limit. If you find you use Instagram more than you’d like, in the menu go to Your Activity > Time Spent then tap Set Daily Time Limit. You’ll now get a reminder if you spend more than the set limit using Instagram. And finally, consider taking the Instagram app off your Home screen. If it’s too easy to access, you’re more likely to do it subconsciously. Hide the app in a folder!
You don't need a fancy-pants camera to capture awesome firework pics this summer.
The post How to shoot better firework photos with your smartphone appeared first on Popular Photography.
If you’ve ever tried to snap photos of fireworks with your phone, you probably weren’t very happy with the results. Thankfully, with a bit of care and the right tools, you can get much better images.
While it’s tempting to zoom in when you’re taking photos of things like fireworks, it’s a bad idea. Modern smartphones have not only different lenses but also different sensors behind them. Take the iPhone 13 Pro. It has three 12-megapixel cameras: the main wide-angle, an ultra-wide-angle, and a telephoto. All three have very different specs.
The ultra-wide-angle lens has an aperture of f/1.8, while the telephoto lens has an aperture of f/2.8. Both use 1/3.4-inch sensors. The main wide camera, however, has an aperture of f/1.5 and a 1/1.65-inch sensor. Not only does its wider aperture let in more light, but its larger sensor (with roughly 3.6 times more area) is better able to capture it. It doesn’t matter that all three have the same 12-megapixel resolution: at night, when you’re shooting fireworks, the main camera is far superior.
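The light-gathering gap between those apertures is easy to quantify: illuminance on the sensor scales with the inverse square of the f-number. A quick back-of-the-envelope check, using the f-numbers above (this ignores sensor size and lens transmission losses):

```python
def relative_light(n_wide, n_narrow):
    """How many times more light per unit of sensor area an aperture of
    f-number n_wide delivers compared with the narrower n_narrow.
    Illuminance at the sensor scales as 1 / N^2."""
    return (n_narrow / n_wide) ** 2

main_vs_tele = relative_light(1.5, 2.8)  # f/1.5 main vs. f/2.8 telephoto
main_vs_uw = relative_light(1.5, 1.8)    # f/1.5 main vs. f/1.8 ultra-wide
print(round(main_vs_tele, 2))  # -> 3.48
print(round(main_vs_uw, 2))    # -> 1.44
```

So even before sensor area enters the picture, the main camera collects roughly 3.5 times the light per unit area of the telephoto, and its larger sensor compounds that advantage.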
So, whatever smartphone you use, the best camera to use is normally the default one. (If you want to dig deeper, check the specs.)
Your smartphone is probably great at automatically snapping photos in most situations, but night-time fireworks may not be one of them. To get the best results you need to be able to set both the shutter speed and ISO.
The exact settings will vary depending on your situation, but a shutter speed of a few seconds (somewhere between one and five) and a low ISO are good values to start with. Oh, and make sure the flash is turned off.
Some smartphones have default camera apps that allow you to manually adjust settings. If yours doesn’t, you have plenty of options. On iPhone, check out Manual ($3.99) and ProCamera ($14.99). We also love Halide Mark II (but at $12/year after a free trial, it’s harder to recommend if you don’t take a lot of smartphone photos). On Android, Open Camera (free), Camera FV-5 ($3.95), and ProShot ($4.99) are all worth checking out.
With shutter speeds between one and five seconds long, any optical image stabilization in your smartphone is going to be working pretty hard.
If you have a smartphone tripod (we love Joby’s GorillaPod), it’s a good idea to use it. Otherwise, go with a slightly shorter shutter speed (say, one or two seconds), hold your smartphone close, and brace your arms against your torso.
If you’re standing around waiting for the fireworks to start (or even better, are involved in getting the display going) then get your smartphone prepped and ready to go ahead of time. Work out where you want to stand, where you want to rest your tripod (or balance your smartphone), and where you need to focus. This is also an opportunity to dial in your settings and try out a few different shutter speeds to see how they work with the environment you’re in.
It’s unlikely you’ll get an amazing firework photo the first time you tap the shutter button. The tenth, though? Much more likely!
While you won’t be able to use burst mode if you have your shutter speed set to a second or slower, you can still keep tapping away at the shutter button yourself. If you find this is adding camera shake, consider using a short shutter delay.
The more photos you shoot, the better you’ll get at timing them, so don’t be afraid to experiment! Depending on your shutter speed, you’ll need to take your photo sometime between the firework’s launch and its initial explosion to get the best results.
Here at PopPhoto, we’re big believers that a photo isn’t done until you have edited it. Open the image in your favorite editing app (even if it’s Instagram), increase the saturation, and maybe the contrast. Crop it so it’s well framed and you’re good to go!
Shoot a 'professional-looking' portrait every time.
The post How to get a blurry background in portraits appeared first on Popular Photography.
A blurry, out-of-focus background is often seen as the mark of a professional, high-quality portrait. While that’s certainly a bit of a simplistic take, it’s undeniable that portraits with a sharp subject and a soft, creamy background are popular, and a look many photographers want to create. Here’s how.
In photography, the “depth of field” of an image is the range of distances that appears acceptably sharp. Most smartphone snapshots and landscape photos have a large depth of field, so (almost) everything looks clear and in focus. Portraits with a blurred background, though, have a really shallow depth of field; only the subject (or even part of the subject, like their face or eyes) is actually in focus.
The depth of field of an image is determined by three things: the lens’s aperture, the distances between the camera, subject, and background, and the lens’s focal length.
All else being equal, the wider your lens’ aperture, the shallower the depth of field will be. The physics of this gets quite complicated fast, but the important thing to understand is that this holds true regardless of what lens or camera you’re using.
A wide aperture, like f/1.4 or f/2, will result in a noticeably shallower depth of field (i.e. a blurrier background) than a narrower aperture like f/8 or f/11. And very narrow apertures like f/16 or f/22 will result in nearly everything in an image being in focus. There are other tradeoffs that come with using a wide aperture (most lenses are sharpest stopped down a bit) but for a nice blurry background, the widest possible aperture is a good way to go.
The further something is from the point of focus in an image, the more out of focus it will be. For portraits, this means the greater the distance between your subject and the background, the blurrier your background will appear.
The relative distance between you, your subject, and the background also matters here. The closer you are to your subject compared to the distance between them and the background, the shallower the depth of field will appear to be.
Imagine taking a portrait of someone a few feet away with a wall behind them. If they’re leaning against the wall, it will be almost the same distance from the camera as they are, so it will probably appear pretty in focus. On the other hand, if they’re standing 20 feet in front of the wall, it’s going to be as out of focus as can be.
Related: An introduction to boudoir photography
Focal length doesn’t directly affect the depth of field of an image; however, it does affect the kind of images you can take. The longer the focal length of a lens, the more prominent the background will appear; when combined with a wide aperture and sufficient subject-background distance, you get a blurry background and an in-focus subject. It’s not that you can’t get a blurry background with a wide-angle lens, but you have to stand very close to your subject, which can create distortion and otherwise make for a weird-looking portrait.
Moderate telephoto lenses—between say, 50mm and 150mm on a full-frame camera—make it the easiest to manipulate the distance between you, your subject, and the background while getting a well-composed shot. This is why that focal range is so popular with portrait photographers.
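To see how these three factors interact numerically, you can plug them into the standard thin-lens depth-of-field approximation. Here's a hedged sketch: the hyperfocal formulas are textbook optics, but the 0.03 mm circle of confusion is just a common full-frame assumption, and real lenses will deviate somewhat:

```python
def depth_of_field(f, n, s, c=0.03):
    """Approximate depth of field in mm, via the hyperfocal-distance
    approximation. f = focal length (mm), n = f-number, s = subject
    distance (mm), c = circle of confusion (mm). Valid while the
    subject is closer than the hyperfocal distance."""
    h = f * f / (n * c) + f                  # hyperfocal distance
    near = s * (h - f) / (h + s - 2 * f)     # near limit of sharpness
    far = s * (h - f) / (h - s)              # far limit of sharpness
    return far - near

# A 50mm lens focused on a subject 2 m (2000 mm) away:
wide_open = depth_of_field(50, 1.8, 2000)    # f/1.8
stopped_down = depth_of_field(50, 8, 2000)   # f/8
```

At f/1.8 only about 17 cm around the subject is sharp, while f/8 yields roughly 78 cm, which is why the background melts away at wide apertures.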
Now that you understand what contributes to a shallow depth of field you should be starting to piece together how to get the effect you’re looking for. Let’s lay it all out, though.
First, choose a moderate telephoto lens with a wide aperture. A 50mm f/1.8 is ideal and can be picked up for most cameras for around $150. If you have a DSLR or mirrorless camera with a kit lens, you will get better results at 55mm (or longer if your lens has more reach) with the aperture set to f/5.6 than at 18mm with the aperture at f/3.5. (As with all things in photography, a shallow depth of field involves trade-offs.)
Put your camera in aperture priority mode and set the aperture as wide as it will go. Set your ISO to auto, since with the aperture wide open, getting a fast shutter speed for a good exposure won’t be a problem in most lighting conditions.
Stand close enough to your subject that you can take either a headshot (where just their face and shoulders are in the shot) or a half-body portrait. Position your subject so they’re at least as far from the background as you are from them. (While this isn’t a fixed rule, it works as a general guideline in most situations.) Get them to pose and take the shot.
As you play around with this technique you’ll notice that the background matters, even though it’s out of focus. Cool textures, like brickwork or leaves, make for nicer backgrounds than busy scenes, like a crowded street.
Unfortunately, in most cases, it’s impossible to get a portrait like this with a smartphone without using a “portrait mode” which relies on computational photography to blur the background after the shot. The results from modern smartphones are getting really good, but it’s a totally different technique.
If you are trying to get this effect with your smartphone, some of these tips will still work. For starters, position your subject as far from the background as possible. While your smartphone isn’t creating the blur optically, it may use a depth sensor and/or machine learning to work out which part of the frame is your subject. The more they’re separated from the background, the easier time your phone will have and the better your results will be.
It’s also important to stand close enough to your subject to get a headshot, or at most, a half-body portrait. Again, we’re trying to maximize the chances that your smartphone will be able to clearly identify your subject and the background. The closer you are (and the further away the background is) the better.
Also, be sure to be mindful of branches, furniture, or anything else that is between you and your subject! They can throw off your smartphone and give you weird results.
With proms, graduations, and summer BBQ season upon us, now is as good a time as any to brush up on portrait photography best practices.
The post 10 ways to improve your outdoor smartphone portrait shots appeared first on Popular Photography.
With proms, college graduations, and even just plain old family barbecues coming up over the next few months, we thought we’d put together some of the best ways to improve the outdoor portraits you shoot with your smartphone. With bright midday sunlight, bustling crowds, and many other challenges, these can be tricky shots to get right. Here are some ways to nail the shot.
Related: How to make a stop-motion movie with a smartphone
Your smartphone lives in your pocket and gets handled dozens of times per day so, let’s be honest, the lens probably has a smudge or two on it. Not only can these make your photos blurry, but if there are bright lights around they can increase the lens flare and glare you see in the final image. Before taking an important portrait, take two seconds to clean your smartphone camera with a lens or glasses cloth and, if you have it, a bit of glasses cleaning spray.
If you want a lovely portrait, don’t stand back. Get close and try for a headshot or half-length portrait. Sure, snap a few photos showing off your subject’s full outfit if you like, but for a wall-worthy portrait, it’s better to be closer.
The worst place to shoot portraits is outside on a bright sunny day. The overhead sun casts harsh shadows across people’s faces that tend not to look great. The good news is there’s a simple fix: shoot in the shade.
My favorite place to capture outdoor portraits is under a tree or archway, but any kind of shade will do. Find a nice laneway, alley, or just a brick wall that will block out the sun’s hard rays.
The secret to getting one great photo is to shoot 50 bad ones—and this applies in almost all situations. If you’re taking a photo of someone, snap 10 or 15 photos in quick succession. In some of them, they’ll blink, grimace, or look away. But in one or two they’ll be posing perfectly.
Your smartphone does a pretty good job of making photos look okay, but if you want a really great portrait, open it in Lightroom, VSCO, or Snapseed and adjust things until they’re perfect. We’d suggest tweaking the exposure, bumping the contrast and saturation, and even adding a vignette or some color toning. If all that sounds like too much, even just cropping it so your subject dominates the frame can turn an okay portrait into a profile pic.
If your smartphone has a telephoto lens, use it. The wide-angle camera is great for getting lots into a photo, but a telephoto is better for portraits since it will normally have a field of view that more closely matches how we see the world.
With that said, if your camera doesn’t have a telephoto don’t stress it. You can get great portraits with any lens. Just whatever you do, don’t use digital zoom.
Most modern smartphones have a “Portrait Mode” that will try to blur the background in your photos. It can work great, and it’s worth taking a few photos using it, but you should also snap a few without Portrait Mode on, just to cover all your bases. You don’t want the only photo from an important event to have some weird blur around the edges of your subject because your smartphone mistook their mortarboard for the backdrop.
Portraits have two things: a subject and a background. Most of the time we photographers focus on the subject (pun intended) but for truly great portraits, it’s worth taking some time to find a nice background too. An old wall, a field of flowers, even just the body of a tree are all much better than a busy crowd.
While we often discourage photographers from “chimping,” or constantly checking the photos they shoot on the back of their digital camera, if you’re trying to capture a special moment or event, that is a rule you should break. Once you’ve shot a few pictures, have a quick look and make sure everything looks right. If it does, great. If not, you’ve got a chance to fix things on location!
If you take a nice photo of someone, share it with them. Too many gorgeous portraits gather digital dust on people’s smartphones. Once you’re home from an event and you’ve edited the portraits, pop them in a shared Google Drive folder or iCloud album.
The post 10 ways to improve your outdoor smartphone portrait shots appeared first on Popular Photography.
Articles may contain affiliate links which enable us to share in the revenue of any purchases made.
Plus: a first look at macOS 13 Ventura, iOS 16, and more.
The post Meet Apple’s powerful new M2 MacBook Air appeared first on Popular Photography.
Apple’s Worldwide Developer Conference (WWDC) kicked off this week with the announcement of a new MacBook Air and first looks at macOS 13 Ventura, iOS 16, iPadOS 16, and watchOS 9. It’s a giant stew of features and technologies meant to excite developers and prepare them for the software releases later this year.
But what about photographers? Several photo-related changes are coming, including improvements that take advantage of computational photography. Given this column’s interest in AI and ML technologies, that’s what I’m mostly going to focus on here.
Keep in mind that the operating system releases are currently available only as betas to developers, with full versions likely coming in September or October. As such, it’s possible that some announced features may be delayed or canceled before then. Also, Apple usually holds some details in reserve, particularly regarding the hardware capabilities of new iPhone models.
That said, here are the things that stood out to me.
Photographers’ infamous Gear Acquisition Syndrome isn’t limited to camera bodies and lenses. The redesigned MacBook Air was the noteworthy hardware announcement, specifically because it’s powered by a new M2 processor.
Related: Testing the advantages of Apple’s ProRAW format
In short, the M2 is faster and better than the M1, which itself was a stark improvement over the Intel-based processors Apple had been using before transitioning to its own silicon. A few standout specs will interest photographers. The memory bandwidth is 100 GB/s, 50 percent more than the M1, which will speed up operations in general. (The M-series architecture uses a unified pool of memory for CPU and GPU operations instead of discrete chipsets, increasing performance; up to 24 GB of memory is available on the M2.)
Photographers and videographers will also see improvements due to 10 GPU cores, compared to 8 on the M1, and an improved onboard media engine that supports high bandwidth 8K H.264 and HEVC video decoding, a ProRes video engine enabling playback of multiple 8K and 4K video streams, and a new image signal processor (ISP) that offers improved image noise reduction.
In short, the M2 offers more power while also being highly efficient and battery-friendly. (The battery life I get on my 2021 MacBook Pro with M1 Max processor is unreal compared to my 2019 Intel-based model, and I’ve heard the fan spin up only on a handful of occasions over the past 6 months.)
The MacBook Air’s design reflects the new MacBook Pro’s flattened profile—goodbye to the distinctive wedge shape that defined the Air since its introduction—and includes two Thunderbolt ports and a MagSafe charging port. The screen is now a 13.6-inch Liquid Retina display that supports 1 billion colors and can go up to 500 nits of brightness.
Apple also announced a 13-inch MacBook Pro with an M2 processor in the same older design, which includes a Touch Bar but no MagSafe connector. The slight advantage of this model over the new MacBook Air is the inclusion of a fan for active cooling, which allows for longer sustained processing.
The M2 MacBook Air starts at $1199, and the M2 MacBook Pro starts at $1299. The M1-powered MacBook Air remains available as the $999 entry-level option.
Next on my list of interests is the Continuity Camera feature. Continuity refers to technologies that let you pass information between nearby Apple devices, such as copying text on the Mac and pasting it on an iPad. The Continuity Camera lets you use an iPhone 11 or later as a webcam.
Using a phone as a webcam isn’t new; I’ve long used Reincubate Camo software for this (and full disclosure, wrote a few articles for them). Apple brings its Center Stage technology for following subjects in the frame and Portrait Mode for artificially softening the background. It also features a Studio Light setting that boosts the exposure on the subject (you) and darkens the background to simulate external illumination like a ring light. Apple does these things by using machine learning to identify the subject.
But more intriguing is a new Desk View mode: It uses the iPhone’s Ultra-Wide camera and likely some AI technology to apply extreme distortion correction to display what’s on your desk as if you’re looking through a down-facing camera mounted above you. Other participants on the video call still see you in another frame, presumably captured by the normal Wide camera at the same time.
A few new features take advantage of the software’s ability to identify content within images and act on it.
The iPhone in iOS 16 will have a configurable lock screen with options for changing the typeface of the current time and including widgets for getting quick information at a glance. If the wallpaper image includes depth information, such as a Portrait Mode photo of someone, the screen automatically places the time behind them (a feature introduced in last year’s watchOS 8 update). It can also suggest photos from your library that would work well as lock screen images.
Another clever bit of subject recognition is the ability to lift a subject from the background. You can touch and hold a subject, which is automatically identified and extracted using machine learning, and then drag or copy it to another app, such as Messages.
The previous iOS and iPadOS updates added Live Text, which lets you select any text that appears in an image. In the next version, you can also pause any frame of video and interact with the text. Developers will be able to add quick actions to do things like convert currency or translate text.
Apple’s Photos app has always occupied an odd space: it’s the default place for saving and organizing images on each platform, but it needs broad enough appeal that it doesn’t turn off average users who aren’t looking for complexity. I suspect many photographers turn to apps such as Lightroom or Capture One, but we all still rely on Photos as the gatekeeper for iPhone photos.
In the next update, Apple is introducing iCloud Shared Photo Library, a way for people with iCloud family plans to share a separate photo library with up to six members. Each person can share and receive all the photos, bringing photos from family events together in one library without encroaching on individual personal libraries.
You can populate the library manually, or use person recognition to specify photos where two or more people are together. Or, you can set it up so that when family members are together, photos will automatically be sent to the shared library.
Other Photos improvements include a way to detect duplicates in the Photos app, the ability to copy and paste adjustments between photos or in batches, and more granular undo and redo options while editing.
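Apple hasn’t said how the new duplicate detection works, but a common technique for finding near-identical photos is perceptual hashing. Here’s a minimal difference-hash (dHash) sketch in pure Python, assuming the image has already been downscaled to a small grayscale grid:

```python
def dhash(pixels):
    """Difference hash: compare each pixel to its right-hand neighbor.

    `pixels` is a small grayscale image as a list of rows (typically
    9 wide by 8 tall after downscaling). The resulting bit pattern
    survives resizing and re-compression, so near-duplicate photos
    produce near-identical hashes.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a, b):
    """Count differing bits; a small distance suggests a duplicate."""
    return bin(a ^ b).count("1")
```

Two exports of the same photo typically differ by only a handful of bits, while unrelated photos differ by dozens, so duplicates can be flagged with a simple distance threshold.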
The last thing I want to mention isn’t related to computational photography, but it’s cool nonetheless. Currently, you can use the Sidecar feature in macOS to use an iPad as an additional display, which is great when you need more screen real estate.
In macOS Ventura and iPadOS 16, an iPad Pro can be set up as a reference monitor to view color-consistent photos and videos as you edit. The catch is that according to Apple’s footnotes, only the 12.9-inch iPad Pro with its gorgeous Liquid Retina XDR display will work, and the Mac must have an M1 or M2 processor. (I added “gorgeous” there; it’s not in the footnotes.)
Speaking of screen real estate, iPadOS 16 finally—finally!—enables you to connect a single external display (up to 6K resolution) and use it to extend the iPad desktop, not just mirror the image. Again, that’s limited to models with the M1 processor, which currently includes the iPad Pro and the iPad Air. But if you’re the type who does a lot of work or photo editing on the iPad, external display support will give you more breathing room.
A new feature called Stage Manager breaks apps out of their full-screen modes to enable up to four simultaneous app windows on the iPad and on the external display. If you’ve ever felt constrained running apps like Lightroom and Photoshop side-by-side in Split View on the same iPad screen, Stage Manager should open things up nicely. Another feature, Display Zoom, can also increase the pixel density to reveal more information on the M1-based iPad’s screen.
I’ve focused mostly on features that affect photographers, but there are plenty of other new things coming in the fall. If nothing else, the iPad finally has its own Weather app and the Mac has a full Clock app. That may not sound like much, but it helps when you’re huddled in your car wondering if the rain will let up enough to capture dramatic clouds before sundown, or when you want a timer to remind you to get to bed at a respectable hour while you’re lost in editing.
Apps like Halide and Camera+ make it easy to bypass your smartphone's computational wizardry for more natural-looking photos.
The post Outsmart your iPhone camera’s overzealous AI appeared first on Popular Photography.
Last weekend The New Yorker published an essay by Kyle Chayka with a headline guaranteed to pique my interest and raise my hackles: “Have iPhone Cameras Become Too Smart?” (March 18, 2022).
Aside from being a prime example of Betteridge’s Law of Headlines, it feeds into the idea that computational photography is a threat to photographers or is somehow ruining photography. The subhead renders the verdict in the way that eye-catching headlines do: “Apple’s newest smartphone models use machine learning to make every image look professionally taken. That doesn’t mean the photos are good.”
The implication there, and a thrust of the article, is that machine learning is creating bad images. It’s an example of a type of nostalgic fear contagion that’s increasing as more computational photography technologies assist in making images: The machines are gaining more control, algorithms are making the decisions we used to make, and my iPhone 7/DSLR/film SLR/Brownie took better photos. All wrapped in the notion that “real” photographers, professional photographers, would never dabble with such sorcery.
(Let’s set aside the fact that the phrase “That doesn’t mean the photos are good” can be applied to every technological advancement since the advent of photography. A better camera can improve the technical qualities of photos, but doesn’t guarantee “good” images.)
I do highly recommend that you read the article, which makes some good points. My issue is that it ignores—or omits—an important fact: computational photography is a tool, one you can choose to use or not.
Related: Meet Apple’s new flagship iPhone 13 Pro & Pro Max
To summarize, Chayka’s argument is that the machine learning features of the iPhone are creating photos that are “odd and uncanny,” and that on his iPhone 12 Pro the “digital manipulations are aggressive and unsolicited.” He’s talking about Deep Fusion and other features that record multiple exposures of the scene in milliseconds, adjust specific areas based on their content (such as skies or faces), and fuse it all together into a final image. The photographer just taps the shutter button and sees the end result, without needing to know any of the technical elements such as shutter speed, aperture, or ISO.
You can easily bypass those features by using a third-party app such as Halide or Camera+, which can shoot using manual controls and save the images in JPEG or raw format. Some of the apps’ features can take advantage of the iPhone’s native image processing, but you’re not required to use them. The only manual control not available is aperture because each compact iPhone lens has a fixed aperture value.
That fixed aperture is also why the iPhone includes Portrait Mode, which detects the subject and artificially blurs the background to simulate the shallow depth-of-field effect created by shooting with a bright lens at f/1.8 or wider. The small optics can’t replicate it, so Apple (and other smartphone developers) turned to software to create the effect. The first implementations of Portrait Mode often showed noticeable artifacts, but the technology has improved over the last half-decade to the point where it’s not always apparent the mode was used.
But, again, it’s the photographer’s choice whether to use it. Portrait Mode is just another tool. If you don’t like the look of Portrait Mode, you can switch to a DSLR or mirrorless camera with a decent lens.
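Conceptually, Portrait Mode boils down to segmenting the subject and blurring everything else. Here is a toy grayscale illustration in Python (nothing like Apple’s actual depth- and machine-learning-based pipeline, which also mimics realistic lens blur):

```python
def portrait_blur(image, mask, radius=1):
    """Toy Portrait Mode: blur pixels where mask == 0 (background),
    keep pixels where mask == 1 (subject) sharp.

    `image` is a 2D list of grayscale values; `mask` is a 2D list
    of 0/1 flags marking the subject. The blur is a simple box
    average, a crude stand-in for lens-like bokeh.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                continue  # subject pixels stay sharp
            total, count = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out
```

The artifacts Chayka describes come from errors in the mask: when segmentation mislabels part of the subject as background (a mortarboard, stray hair), that region gets blurred along with everything else.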
More apt is the notion that the iPhone’s processing creates a specific look, identifying it as an iPhone shot. Some images can appear to have exaggerated dynamic range, but that’s nothing like the early exposure blending processing that created HDR (high dynamic range) photos where no shadow was left un-brightened.
Each system has its own look. Apple’s processing, to my eye, tends to be more naturalistic, retaining darks while avoiding blown-out areas in scenes that would otherwise be tricky for a DSLR. Google’s processing tends to lean more toward exposing the entire scene with plenty of light. These are choices made by the companies’ engineers when applying the algorithms that dictate how the images are developed.
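Underneath, these “looks” come down to how each pipeline weights and merges its bracketed frames. A toy exposure-fusion sketch in Python shows the core idea (a naive per-pixel weighted average favoring mid-tones; the real Apple and Google pipelines are far more sophisticated):

```python
def fuse_exposures(frames):
    """Blend aligned grayscale frames (values 0.0-1.0) into one image.

    Each output pixel is a weighted average across frames, where the
    weight favors well-exposed values near middle gray. This is a toy
    version of the exposure-fusion idea behind multi-shot HDR modes,
    not any vendor's actual algorithm.
    """
    height, width = len(frames[0]), len(frames[0][0])
    out = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            total, weighted = 0.0, 0.0
            for frame in frames:
                v = frame[y][x]
                w = 1.0 - abs(v - 0.5) * 2  # highest weight at middle gray
                w = max(w, 1e-6)            # avoid division by zero
                total += w
                weighted += w * v
            out[y][x] = weighted / total
    return out
```

Tweaking that weighting function is exactly the kind of engineering choice that makes one vendor’s output look naturalistic and another’s look uniformly bright.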
The same applies to traditional camera manufacturers: Fujifilm, Canon, Nikon, and Sony cameras all have their own “JPEG look,” which is often the reason photographers choose a particular system. In fact, Chayka acknowledges this when reminiscing over “…the pristine Leica camera photo shot with a fixed lens, or the Polaroid instant snapshot with its spotty exposure.”
The article really wants to cast the iPhone’s image quality as some unnatural synthetic version of reality, photographs that “…are coldly crisp and vaguely inhuman, caught in the uncanny valley where creative expression meets machine learning.” That’s a lovely turn of phrase, but it comes at the end of talking about the iPhone’s Photographic Styles feature that’s designed to give the photographer more control over the processing. If you prefer images to be warmer, you can choose to increase the warmth and choose that style when shooting.
It’s also amusing that the person mentioned at the beginning of the article didn’t like how the iPhone 12 Pro rendered photos, so “Lately she’s taken to carrying a Pixel, from Google’s line of smartphones, for the sole purpose of taking pictures.”
The Pixel employs the same types of computational photography as the iPhone. Presumably, this person prefers the look of the Pixel over the iPhone, which is completely valid. It’s their choice.
I think the larger issue with the iPhone is that most owners don’t know they have a choice to use anything other than Apple’s Camera app. The path to using the default option is designed to be smooth; in addition to prominent placement on the home screen, you can launch it directly from an icon on the lock screen or just swipe from right to left when the phone is locked. The act of taking a photo is literally “point and shoot.”
More important, for millions of people, the photos it creates are exactly what they’re looking for. The iPhone creates images that capture important moments or silly snapshots or any of the unlimited types of scenes that people pull out their phones to record. And computational photography makes more of those images turn out decent.
Of course not every shot is going to be “good,” but that applies to every camera. We choose which tools to use for our photography, and that includes computational photography as much as cameras, lenses, and capture settings.
Learning how to set a timer on your iPhone camera opens up more ways to use your phone's camera.
The post How to set a timer on your iPhone camera appeared first on Popular Photography.
Self-timers have been common on cameras since the early film days. Modern cameras, including the one built into your iPhone, still offer a timer function, and it can be extremely useful in various photography situations. It can open up more opportunities to use your camera and even help you take better photos in some circumstances. Older iPhone models don’t have built-in timers, but if your device runs iOS 8 or newer, you should have access to this setting.
If you want to take photos of yourself that don’t have the selfie look with the arm extending out, the timer will give you time to get into position in front of the camera. Sometimes you need some time to adjust or move something into place before the shutter goes off, and if you only have one set of arms, that can be a challenge. The self-timer helps with those situations.
Group photos are also good opportunities for the self-timer. If you have a group gathered but no one to take the photo, the timer on your iPhone makes the shot possible. The longer timer options even make photos of big groups possible when the camera has to be far away, since you’ll have time to set everything up and run back into place.
Beyond selfies and group photos, the timer can be helpful for other reasons. For example, if you want to use your camera in darker conditions and need it to stay perfectly steady, the timer can help. That way, nothing is touching the phone when the shutter goes off; tapping the screen or pushing the shutter button can introduce photo-ruining shake. The timer is also handy for product shots that need a hand model or props held in position.
No matter what you are using the timer on your iPhone camera for, there is one thing you need to do first: Make sure to set your iPhone up in a stable, sturdy position. You could lean it against or wedge it into something, as long as the camera isn’t obstructed and the phone won’t slide down. Of course, you could avoid all of that inconvenience by simply using a tripod.
There are lots of phone-specific tripods out there, or even tripod mounts for regular tripods. Many of them do the job excellently. Some phone tripods are small and meant to be on tabletops or other surfaces. They won’t extend very high, if at all, so you’ll be stuck with the height of whatever surface you are using, plus the limited size of the tripod. Some extend to heights like standard camera tripods, which may be ideal if you need something more versatile.
Once you know how to set the timer on your iPhone camera, go experiment with it. While these tools have practical uses, the real fun stuff comes from experimentation.