Saturday, February 26, 2011

What I learned from the trip

Dear readers,

In no particular order,
  1. When planning a long cycling trip, it is advisable to leave early and work within the limits of natural daylight. We overshot ours by 40 mins.
  2. No matter what, stick to the schedule. Even if your friend calls and says "...wait for half an hour, we'll come on a motorbike and then we can leave together", don't follow his advice. You'll be the one on a cycle, and natural light won't show any mercy. Thankfully, our friends did not come.
  3. There is a very small window available for nature photography. For greens and colourful flora, it is between 8-10 am and about 4:30-5 pm during this time of year in the northern tropical region.
  4. Water bodies have the best contrast and vibrancy during midday.
  5. From afternoon through sunset, the exposure value drops rapidly from about 14 to about 9.
  6. In small settlements, a single shop owner has to give multiple services. I saw a shop that was a photo studio, print shop, photocopy shop, mobile repair centre and mobile and DTH recharge centre all rolled into one.
  7. Just like on hilly terrain treks, carry sufficient carbohydrates or make sure you buy some from shops at regular intervals. This won't reduce fatigue but will keep the body going. We had sugarcane juice, biscuits, deep fried moong, chocolate, rice + fish, momo and tea.
  8. A good backpack and a comfortable pair of hiking shoes go a long way. Rahul desperately needs a good hiking backpack.
  9. Wear thick trousers like jeans to reduce friction between the seat and your skin. ...and we were in our thin shorts while returning.
  10. Make use of natural slopes as much as possible. Keep driving the pedal to gain some momentum. This will help the ride during the up-slope.
  11. The same down-slope will kill you during the return journey by becoming an up-slope.
  12. Enjoying the ride is the key. Stop at as many places as you find interesting. The destination is unimportant.

Hostel to Bhasra Ghat (and back)

Dear readers,

This was a 67 km trip on cycle. With all the food and photoshoot breaks and the side trips we took, it took us about 10 hours - 8:45 am to 6:50 pm. The actual riding time is between 3 hrs 10 mins and 3 hrs 30 mins each way.

Here is the map. Zoom out to see the route view.



Friday, February 25, 2011

A note to myself (stdio.h not found)

Dear readers,

A default installation of gcc via the Ubuntu repository skips a very important package: build-essential

Solution to:
fatal error: stdio.h not found.


Monday, February 21, 2011

Link to Picasa Web Album

Dear readers,

In case you wish to see the scans referred to in this post :

Film scans : Dec '10 to Feb '11

I will be arranging and backing up the HI-RES digital scans tomorrow.


Sunday, February 20, 2011

On 35mm C41 film processing (in India)

Dear readers,

Gear : I use a borrowed Vivitar 3000S with a 50mm f/1.7 lens. No flash. I usually use ISO 400 films, mostly Kodak Max, sometimes Fuji. My choice of sensitive films is governed by my inclination towards low light photography.

The last few times, I was utterly disappointed by the prints handed over to me by the local studios. They had quite a few problems.
  1. The films were sliced carelessly into strips of 4. In some cases a strip had about 20% of the next frame's exposed width. Damage like this is irreparable.
  2. The prints had a substantial part of the image missing; the printed area was about 80% of the actual frame.
  3. The prints had horizontal lines across them as if the brushes of the paper feed roller had scratched them. When I complained, the studio said that there were lines on the negative itself. I seriously started doubting my gear.
  4. The colours were outright unnatural. There was way too much saturation.
I had asked Rahul Mehta to get the last cartridge printed from Delhi. Some of the images were amazing. This boosted my confidence, put my faith back in analog and proved that the negatives were fine.

I decided to ditch that workflow altogether.

Firstly, I had neither chemicals nor access to a darkroom to develop film. I asked a local studio to develop the cartridge and give me the entire reel uncut. Then I got hold of a flatbed film scanner in my department and started scanning. (Doing that right now.) Simultaneously, I am preparing print-ready PDFs after colour correction and framing. (Simultaneously writing this journal entry, too, since the scanner takes a lot of time to scan film at 2400 ppi.)

This has exposed many things that would otherwise have gone unnoticed had I not dived hands-on into the process.
  1. The film scanners take the entire 35mm exposed area into account. This in itself is a huge improvement over the DX-windowed prints I had received earlier.
  2. Each exposure requires some attention to get a print-ready file. Some require a basic curve adjustment while others require meticulous control over shadows, midtones and highlights separately. Some even require selective retouch that can be analogous to selective under and overexposing of halide sheets. The cookie cutter approach of labs is not satisfactory at all.
  3. The lens is not as sharp as a Nikkor or Canon 50mm f/1.8D. But we must also take into account that this lens is damn cheap - the body + lens cost the original owner INR 3.5k, while the Nikkor lens alone is INR 6.3k. A marginally better lens, the 50mm f/1.4D, is INR 16.7k. The Canons are even more expensive.
  4. When objects are far away and light is low, ISO 400 films show significant grain - so much so that at times it is difficult to even identify the object / person. Coupled with a less sharp lens, this is a big problem.
  5. A 1 EV underexposure can be corrected in post-processing at the cost of dynamic range, but a 1 EV overexposure is impossible to fix. The brackets in artificial light (specifically, sodium and mercury floodlights) can be +0.5, 0.0, -1 EV.
  6. Sometimes accidental motion gives unexpected 'unnatural results'. I managed to get two of them in this cartridge.
  7. There is a 'feel' factor associated with film images that is nearly impossible to get in digital. The jitters, grain, dust and speckles all add to that. In my opinion, in arts, character must take precedence over accuracy. It took me 4-5 hours to get the entire reel done (and simultaneously write this), but it's worth it.

Progress on Thesis : Update 3

Dear readers / myself,

All major chapters are over except
Chapter 2 : Literature Review's Objective of the present work
Chapter 5 : Results and Discussion's Effects of layerwise damage in
  1. Free vibration and buckling
  2. Simple and combination resonance characteristics
  3. Follower force flutter characteristics
Chapter 6 : Conclusion (including future scope)


Saturday, February 19, 2011

My rants on Mixtape India Volume 4

Dear readers,

I believe that a musician or an artist should never be a critic, and I have been trying not to criticise stuff ever since that realisation dawned on me. Today I will make an exception. The reason is very simple - it is my engineered work that has been butchered.

Mixtape India Volume 4 contains a song from an album I engineered - In Human's Voices (track 8). [Their myspace page streams the entire album for free.] I do not know who makes these mixtapes, but I plead that whoever it is take a temporary break from the world of sonic engineering - especially mixing and mastering - until he or she has learned the science behind it. The art develops with aesthetic sense in due time, but the science doesn't.

Let me put things in perspective. The download offered is 128 kbps MP3 - a far cry from audiophile standards, but we must relax our parameters since the host has to keep in mind the server's bandwidth as well as people on low-speed connections.

...but here are the points (mostly judged on the sad state of track 8):
  1. The choice of codec is definitely not the best. He could have offered Ogg files at 128 kbps average bit rate - a 128 kbps Ogg is sonically similar to a 160 kbps MP3. To top it all, 9 out of 10 tracks converted to WAV with major problems: there were reports of "MPEG stream error at XXXX bytes" every now and then. I am certain it is not a download fault, as the package is a zip file and the ZIP algorithm has a checksum built in.
  2. The ReplayGain of the tracks was between -7.8 and -9.2 dB, with track peaks at 1.0 throughout. What does that mean in layman's terms? That the music is whimpy loud* and the tracks have reached the digital maximum of 1.0, or 0x7FFF for a 16-bit sample. Well, the following image tells everything.

    *N.B. Loud refers to the loudness achieved by reducing the dynamic range and increasing the gain.
  3. Right at the In Human track's 0:35 and 0:38 marks, the sound gives away the tell-tale sign of mindless compression. As soon as the guitar on the left channel goes solo, there is a sudden volume boost. And let me tell you, this is not the threshold-ratio-makeUpGain kind of compression we are talking about, which everyone uses to bring in character, colour, control and warmth. It is the threshold-makeUpToGain type. It could very well be a hybrid limiter or, worse, a multiband compressor/limiter.
  4. A sample analysis will tell that this is the curve he or she has gone for

    While this is acceptable, a flat cookie-cutter approach throughout is not. That In Human track has lost its clarity and is left with only dullness and muddiness, not to mention a bitter sense of agony in me.
To all who really wish to enjoy the original mixes, please ask any of the band members for a CD or send a mail at

The CD was not mastered due to budget restrictions but it was not shoddily put together either. The CD will still sound clear and punchy. If you have a sound system that can handle the SPL, turn the volume knob clockwise. These guys are amazing musicians and I loved working with and for them.


Thursday, February 17, 2011

Searching for a new DAW

Dear readers,

Before I start, I must explain that the versions I am writing about are old ones and may not reflect the current trends in the respective DAWs.

I have been a big fan of Nuendo 3. It did what it did, and it did that well. In the early days of In the Box (ITB) audio production, the choice of multitrack software was virtually non-existent. In those days I used n-Track. I don't remember the exact version, but I remember hating its look, its controls and, worst of all, its workflow. So much so that after I had recorded the tracks, I exported stems and mixed them separately in Sony Vegas. Yup - a video editing software that had marginally better multitracking and routing capabilities than n-Track. (Current version: Studio 6.1.1)

A year and a half later, I came across Nuendo 3. Back then I did not know that Nuendo was the big brother of Cubase - that the former was a video/audio post-production suite while the latter was Steinberg's flagship DAW. I was quite happy with the software and went on to produce and record albums on it. A few years later, I had a chance to get my hands on Nuendo 4. What I started hating about it was that writing music on Nuendo 4 is painful. Back in the old days, I loved the liberation Nuendo 3 had given me, but the same feature-rich baggage is a composer's nightmare. The splash screen stays on for an eternity. I have nothing against Cubase/Nuendo as production software; I just don't believe it is a good tool for writing and scribbling ideas. (Current versions: Cubase 6/Nuendo 5)

I am well aware that many bands like Linkin Park work with Pro Tools and do the same job faster and better. (Two points here: a) I don't like them much except for their first album. b) They are rich dudes.) See here and here

So, I'm off to test the first candidate for the new decade. REAPER.


Thursday, February 10, 2011

Not going to Bangalore

(Keeping this short)

Dear readers,

Cancelled Bangalore trip. At home on a much needed break. Off the grid 'till 15th Feb.


Tuesday, February 8, 2011

The Google doodle Jules Verne alignment

Dear readers,

This is the alignment of the Google doodle (Feb 8, 2011). Near the bottom-left of the ocean, the ferns spell out Google.


Things to do before leaving for Bangi

Bear(sic) readers (for myself),

Wash clothes
Give formals for ironing
Collect notebooks
Check notebooks (4 of 8 done)
Place them in the structures lab
Take printouts of tickets
Take printouts of HR form
Fill HR form
Inform TA incharge of absence
Isolate MMS results for thesis
Update plan/status of thesis


Sunday, February 6, 2011

Random cycling trip to Midnapore

Dear readers,

This was an approx. 37 km, 4-hour trip, inclusive of about 4 breaks spanning 20 mins.

(Zoom out to see the route on the map)



Taming the Sudoku

Step 1: The Deterministic Sudoku

Dear readers,

After a few days of relative unproductivity, I tried my hand at a disjoint problem, hoping to see a revival of interest in my thesis-writing endeavour. I picked up Sudoku.

I am not an avid Sudoku lover, nor do I intend to be one. Whenever I went home, I often found my father immersed in The Telegraph t2's Sudoku page. He did a lot of guesswork. I always pointed out that it could be solved using a simple elimination-and-substitution methodology. Little did I know that Sudoku was actually a nondeterministic (ND) problem*.

Yesterday, I wrote a nifty Sudoku solver. It is efficient and runs pretty fast. However, it only works for deterministic Sudokus - the ones labelled Easy and Moderate by the newspapers/magazines. (I did not even know some were supposed to be ND as well.)


The zip file also contains some input files that are downright 'evil'. The solver could only solve two cells of the 'extreme' grid. The corresponding output is piped to *_out.txt

You'll have to compile the source 'sudoku_solver.c' yourself. Open this file in a text editor to read the instructions**.

Most sources that can solve ND Sudokus are very slow. I'll have to read a bit to mod this into a fast ND solver for such Sudokus. I may not even be working on them for a very long time; there may not even be a step 2. My thesis is my priority.


*For all the purists: I am not invoking the complexity clause and putting a "polynomial time" suffix on it. I know of its NP nature, but at this point I am least bothered about an optimised solution algorithm.

**Copying the instructions from the file:

Compile this program using the following syntax
gcc sudoku_solver.c -o sudoku_solver.out
Run it using the following command
./sudoku_solver.out <input_filename>

The contents of the input file must be of the form
0 0 0 8 0 0 5 0 2
6 0 9 0 3 0 4 1 0
3 8 5 4 0 0 0 0 6
0 0 0 2 0 0 1 5 9
0 6 0 0 0 0 0 2 0
9 1 2 0 0 3 0 0 0
2 0 0 0 0 8 9 7 5
0 5 4 0 9 0 8 0 1
8 0 1 0 0 5 0 0 0
to represent the following unsolved Sudoku (dots mark blank cells)
| . . . | 8 . . | 5 . 2 |
| 6 . 9 | . 3 . | 4 1 . |
| 3 8 5 | 4 . . | . . 6 |
| . . . | 2 . . | 1 5 9 |
| . 6 . | . . . | . 2 . |
| 9 1 2 | . . 3 | . . . |
| 2 . . | . . 8 | 9 7 5 |
| . 5 4 | . 9 . | 8 . 1 |
| 8 . 1 | . . 5 | . . . |