Posts Tagged ‘Pentax’

Update 10.10.2018

24 October 2018

It has been a while since I posted here…once again. I don’t know why that is, but it certainly hasn’t been for lack of anything to post.

Photography-wise, most of my focus lately has been on my old 400mm Tokina. Getting it into sharp focus seems to be an almost impossible task. “Almost” because I refuse to believe it is impossible. For the longest time, I’ve been trying to get “focus trap” to work on the lens with the Pentax K3. No matter what I tried, it just wouldn’t work. It worked with the K10 and *ist, so it was frustrating not being able to use that method.

Then I came across something on the internet that made me search with some terms I would never have thought of using. Sure enough, it turned out that with the K3, Pentax created a setting in the Custom menu to allow or disallow focus trapping. On top of that, the name of the menu item, at least to me, isn’t intuitive: Sharp Capture. In hindsight, the name does make some sense. So, now I have the ability to use focus trapping.

Even with that, though, it’s still not locking in with the sharp image I remember. Note to self: put the 400mm on the K10 and/or *ist and verify it works like I seem to remember.

I think, however, that there is a slight difference in where the focal plane is compared to the K10 and *ist bodies. That doesn’t make much sense, but right now it’s the only answer I have. Even with the other manual lenses that I remember working properly on the other two bodies, I have to focus past the subject to get a sharp image.

Right now, my plan is to combine focus trapping with the multiple shot mode. That way, when focus trapping triggers, the rapid sequence of images taken while I continue adjusting focus manually should hopefully include at least one image that is sharp. Or at least the sharpest of the lot.

We’ll see.

Well, that idea of focus trapping plus multiple shot mode didn’t work. I guess I’m going to have to try to find another way to do this.

The ham radio arena is the next area to bring up to date. Here, the use of the straight key for Morse code as my computer input keyboard has indeed taught me most of the characters. BUT that only taught me to transmit. Recently, a ham who agreed to be an Elmer (aka Mentor) for me sent me a simple device that flashes an LED to the tune of the dits and dahs of Morse code.

I bring that up because once I got this hooked into my radio and everything tuned, I had a chance to actually see some Morse via the LED. I was able to, for the first time, clearly and without any of the usual difficulties of differentiating them, see the dits and dahs. And that segues nicely into the problem: I could see the dah-dit-dit, but I had absolutely no idea what dah-dit-dit stood for. I actually needed to mentally imagine sending dah-dit-dit with a straight key before I could make the connection to the letter D.

And so I found out that what I was afraid of had actually happened.

I can now transmit a lot of the characters without even having to think about their dit and dah components. The reverse, however, is not true. I could not decipher that visual representation with the same ease I can send it.

Oh, boy.

Now I’m going to have to add an LED to the straight key I use with the computer. There’s an LED on the Teensy board that flashes as I key the characters, but it’s under the board holding the key and not visible. Rats! That would have been a perfect solution.

At least it’s a relatively easy fix, and since the onboard LED already flashes while keying, I can use that same pinout, so no code modifications are needed. I just have to figure out what color LED to use.

Writing. This is where I hang my head in shame.

I’ve done some editing for other authors, but I’ve done precious little writing of my own. As I mentioned last time, I did start a new Pa’adhe story, but nothing past the opening scene and setting up the tale. I have a reasonably decent story for this one, and it’ll provide the backstory for Scarle, but sitting down and writing just hasn’t happened. With any luck, writing and posting this will get me going on it.

Unfortunately, I’ve been spending a fair bit of time programming. Unfortunately because otherwise I might have been writing instead.

I bought a cheap 3.5” TFT LCD display that came with NO instructions or paperwork at all. It took me several months to finally locate what seemed to be the same display being sold by another vendor with tutorials and examples. So, that’s now up and running on one of my Arduino Unos.

Plus I’m waiting for a part for a 2004 display (20 chars x 4 lines) so that I can use the I2C protocol to program it for use with ham radio. This will be a potential display for viewing and decoding Morse code that comes over the air to my radios. It’s intended to just plug into the headphone jack of the radio and display the detected audio and Morse. Eventually I want to modify that to provide the option of showing, selectively, the following: (1) a bar graph or “LED” display of the dits and dahs, (2) a string of dits and dahs such as .... . .-.. .-.. ---, or (3) the actual translation of the code to display HELLO. Maybe even other modes, although at the moment those three seem to cover all bases for me.
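The translation step in option (3) is just a table lookup. Here’s a minimal sketch of that logic in Python (the actual display would of course be driven by Arduino code, and this table covers letters only; the function name is just illustrative):

```python
# Morse-to-text lookup, letters only. The real device would carry this
# same table in its Arduino sketch.
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode(code: str) -> str:
    """Translate space-separated Morse symbols into text; ' / ' marks a word gap."""
    words = code.split(" / ")
    return " ".join(
        "".join(MORSE.get(sym, "?") for sym in word.split())
        for word in words
    )

print(decode(".... . .-.. .-.. ---"))  # HELLO
print(decode("-.."))                   # D
```

Unknown symbols come out as “?”, which would make garbled copy obvious on the 2004 display.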

UGH! That’s enough. This is already longer than planned, and there’s more, like trips. I’ll save those for another time.


A Return to Stereograms

4 April 2018

I have mentioned working on stereograms previously. These last couple of weeks have seen me focused on them.

Stereogram created from drone video taken at Wickahoney. See text for details.

Most of those I have done before are close-ups, if you will, or portrait oriented.

I wanted to play with stereograms some more, this time focusing on landscapes. My goals were, first, to get them working consistently and second to hopefully work out any rules unique to stereograms.

In my mind’s eye, I remember sitting on the floor at my grandparents’ with a big box of stereograms and a now-antique viewer. I would pick out a card with its two slides, read the caption, drop it in the holder, and clap the stereoscope to my face. I remember being fascinated by how I could see a 3D version of a scene and how it contrasted with the pictures on the wall.

More, I remember most of them were landscapes.

Now, I’ll grant you my memories of those images are likely rose colored by time and they may not have been as fantastically 3D as I seem to remember. Most indubitably, though, there were hundreds of landscapes and not so many of flowers, people, or objects.

My goal in this recent project was to create valid 3D landscape stereograms. I also needed to work out what the limitations were, and how best to create a pleasing image that was also 3D.

Like this one:

One of the spots that overlooks Swan Falls and the Snake River Canyon, looking downstream from the dam.

Or this one, where the red rock formation just pops out at you:

Looking at Swan Falls Dam and the Snake River Canyon. The red rock is very prominent in the foreground.

What are the rules?

Aside from standard landscape photography composition “rules” I felt there must be some additional guidelines that would drive the composition.

As it turns out, there are, and there aren’t.

One of the things you need when creating stereograms is a consistent shooting sequence. Mine is:

  1. Take a picture of your subject. Remember where the center of your picture is on the subject.
  2. Take a step to the left. I usually stand with my feet just more than shoulder width apart. After taking the first picture, I move my right foot to touch my left foot then move the left foot so I am again standing with feet apart.
  3. Aim the camera at the exact same point on the subject as before.
  4. Take another picture.

That is my way of getting paired, handheld pictures. The first picture taken thus becomes the “right” picture (as in taken from the right) and the second becomes the “left” picture. The key to assembling the stereogram is the right picture goes on the left side and the left picture goes on the right side.

You can, of course, do it stepping to the right instead. In this case the first image becomes the left picture and the second becomes the right picture. No biggie, just get into the habit of doing it the same way each time.
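That left/right swap is easy to get backwards, so here’s a tiny Python sketch of the ordering rule (the function name and string arguments are just illustrative placeholders for the actual image files):

```python
def stereogram_panels(first_shot, second_shot, stepped_left=True):
    """Return (left_panel, right_panel) for a parallel-view stereogram.

    Stepping LEFT between shots, the first shot is the "right" picture
    and goes on the left side; the second goes on the right.
    Stepping right instead, the order flips.
    """
    if stepped_left:
        return (first_shot, second_shot)  # first = right picture -> left panel
    return (second_shot, first_shot)

# Shot from the right position first, then stepped left:
left, right = stereogram_panels("shot1.jpg", "shot2.jpg")
print(left, right)  # shot1.jpg shot2.jpg
```

Whichever habit you settle on, encoding it once in a helper like this means you never have to re-derive the swap when assembling pairs.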

There are times when the question arises of whether or not the middle and far parts of the photograph actually show as 3D. Sometimes they do, sometimes they don’t. Sometimes they work if you have some decent foreground detail; other times you don’t need that foreground to make it work.

Willow Creek, off Black’s Creek Road. Notice the apparent differences in 3D impact in this compared to the Swan Falls stereograms.

And then, there’s the issue of anything that’s moving…that’s likely to produce “ghosts”, faint or translucent objects in the photo. Your main scene, the one you want to see in 3D, has to be still. Trees moving in the wind, clouds passing by overhead, cars on the road, people moving…all those and more need to be avoided.

One way to avoid natural movements such as clouds and water ripples is to use a long exposure time. That way, things get “smudged” smooth. Ripples on water, for example, become a soft flat surface and clouds become featureless.

Interestingly enough, I have one stereogram (below) where the two angles are such that one shows the parked truck and the other doesn’t, and yet the truck is solid in the stereogram view. It’s not a translucent ghost due to being in only one of the paired images. Yet another stereogram I’ve done freezes a car in one picture but it’s not in the other and this time it shows as a ghost car. Go figure. That’s what I mean about “there are and there aren’t additional guidelines.” More likely I haven’t figured them out yet.

Notice how the black truck is in one image but not the other, yet still comes through in the stereogram as solid, not as a ghost image.

By the way, the Wickahoney stereograms were all pulled from a video created by orbiting my DJI Phantom 4 around the midpoint of the ruins. You do remember that a video is merely a string of still images played back rapidly? Each pair, in this case, was pulled from frames about 1 second apart, e.g. one would be from 13 seconds into the video and the second of the pair would be from 14 seconds in. When creating a stereogram from a video this way, you want to be sure the extracted frames aren’t blurred by the drone moving too fast.
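Finding the paired frames is simple arithmetic, assuming a constant frame rate. A quick Python sketch (the function, frame rate, and timestamps here are illustrative, not from the actual video):

```python
def pair_frame_indices(t_seconds, fps, separation=1.0):
    """Frame indices for a stereo pair pulled `separation` seconds apart.

    For a 30 fps orbit video, the pair at t = 13 s is frames 390 and 420.
    """
    return (round(t_seconds * fps), round((t_seconds + separation) * fps))

print(pair_frame_indices(13, 30))  # (390, 420)
```

A slower orbit lets you use a larger separation for more parallax; a faster one forces the frames closer together before motion blur creeps in.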

Stereogram from Wickahoney drone video.

One thing I did discover is that if you use a zoom or telephoto lens to enlarge something in the distance and make it part of your foreground or the middle distance in the photograph, you have to displace the camera location much more than a single step to one side. A problem I encountered was that I could properly displace the distant solitary tree, but the mountains behind it shifted significantly. They shifted enough that even though I could get the tree to be reasonably 3D, the more distant mountains were blurry.
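There is a common stereo-photography rule of thumb, not mentioned above but consistent with this experience: the “1/30 rule,” which says the baseline (camera displacement) should be roughly 1/30 of the distance to the nearest subject. A quick sketch of the arithmetic (the divisor is the conventional value, and the 60 m example is illustrative):

```python
def suggested_baseline_m(nearest_subject_m, divisor=30):
    """The '1/30 rule' of thumb: stereo baseline ~ subject distance / 30."""
    return nearest_subject_m / divisor

# A "foreground" tree that is really 60 m away via a telephoto:
print(suggested_baseline_m(60))  # 2.0 metres -- far more than one sidestep
```

It also shows why the distant mountains misbehave: a baseline sized for the tree is enormously oversized for the background, so the background shifts too much between the pair.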

A wide angle lens, though, works great and lets you really bring in some foreground:

Snake River Canyon from an overlook at Swan Falls, looking upriver from the dam.

And that’s as far as I’ve got. I’ll be going out and shooting more landscapes, as well as some closer subjects.

I think I know how to apply this technique to video as well and plan to try it with the video used to make the above Wickahoney stereograms. That’s for another time, though.

Time Passes

6 September 2017

It’s true, time passes faster when you’re having fun. It’s also true that if you stop doing something for a while, it’s much harder to resume. At the same time, at least for me, having that break from regular writing and playing on the radio makes it harder to start anything else, since I have those two things hanging over my time. I want to get back into them, but…tomorrow? And because of that attitude, I’m not eagerly starting new projects or doing other things that I want to. Not because I can’t or don’t have time, but because I have those two things hanging over me and I can’t get myself going on anything.

Milky Way from Coyote Grade in the Owyhee Foothills.

I did get out and do some Milky Way photography. It turned out OK, and I had a blast. With all the smoke lately, I haven’t gotten back out to shoot more astrophotography. Or any photography, actually. The drive just isn’t there and to be honest, I’m reluctant to just go by myself. Not because I can’t, but because I know my wife doesn’t like me going solo into the Owyhees. And I’ve sort of gotten used to having someone else along.

Excuses. Just excuses.

I do need to go back out with some settings from that shoot and try doing more and specific adjustments to the camera to attempt to get better images. I also want to take one of my telescopes out and use it to track the camera. That way, I can get some images to try stacking and see how that works to get a better Milky Way image.

In spite of that, I did do a bunch of prep for the recent total eclipse of 2017. For that, I built myself a solar filter out of PVC pipe and gold mylar sheet. A preliminary test showed the filter worked, but I am not happy with it and will probably eventually replace the mylar with newer material or something else. It’s just not perfect.

My homemade solar filter mounted on the lens I will be using for the 2017 solar eclipse.

 

The sun as shot through my homemade solar filter. It’s two layers of gold mylar secured between an inner and outer PVC ring.

One of my sisters lived right in the path of totality, so it was a simple matter to head to her place 45 minutes away and observe the eclipse there.

My setup while shooting the 2017 Total Solar Eclipse. We had 100% totality here.

I did get at least one decent image that I was really happy with, and one I was sort of happy with. And yes, you can see sunspots in my images, so I guess there actually are other decent images from the event.

Baily’s Beads from the 2017 Total Solar Eclipse. You can also see some solar prominences.

 

Another shot of Baily’s Beads as the sun moved out of totality.

 

Sunspots during the 2017 Total Solar Eclipse, along with the moon encroaching. This is early in the eclipse. They’re easier to see in the full size image.

The same time I did the Milky Way shoot, I took my DJI Phantom 4 along and did some flying. I had been mentally rehearsing the remote controller stick movements for a 180 flyby where you spin around 180 degrees without stopping. Or put another way, you’re flying away from a spot looking back at it and at some point along the flight path you spin around to face the other way while you continue flying in a straight line. I’m happy to report I got the stick movements right and it worked pretty well.

And now we have a Purple Air Quality alert for the valley. That’s the color for “very unhealthy” with the next color being brick red and meaning “hazardous”. So, I’ll leave you with one last image, one that I took last night and that I call the Fire Moon. The color is due to the smoke in the air, of course.

The nearly full moon, discolored from the heavy smoke covering Treasure Valley, Idaho.