Drupal SSO: An unsupported mechanism was requested…

Here is a step that may help you debug the error above.

Today I was setting up SSO for a Drupal site against Microsoft Active Directory. Something went wrong, and I found the following message in the site’s error_log:

gss_accept_sec_context() failed: An unsupported mechanism was requested (, Unknown error)

Let’s modify Apache’s config a little and add this in the appropriate place (at global or vhost level):

 LogLevel debug

Restart Apache and check the log again:

kerb_authenticate_user entered with user (NULL) and auth_type Kerberos
Acquiring creds for HTTP/intranet.kesz.hu@KESZ.HU
Verifying client data using KRB5 GSS-API
Client didn't delegate us their credential
Warning: received token seems to be NTLM, which isn't supported by the Kerberos module. Check your IE configuration.
GSS-API major_status:00010000, minor_status:00000000
gss_accept_sec_context() failed: An unsupported mechanism was requested (, Unknown error)

So the real problem was: “received token seems to be NTLM, which isn’t supported by the Kerberos module”. That is much more informative than the unknown error we had before!
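You can also tell the two token types apart by hand. NTLM messages begin with the ASCII signature "NTLMSSP\0", while Kerberos/SPNEGO tokens are DER-encoded and start with byte 0x60. A small sketch (not part of the original debugging session, just a heuristic helper; the header values below are made up):

```python
import base64

def sniff_negotiate_token(authorization_header):
    """Rough classification of an HTTP 'Authorization: Negotiate <base64>' value.

    NTLM messages start with the ASCII signature b'NTLMSSP\\x00'; Kerberos/SPNEGO
    tokens are DER-encoded, so their first byte is 0x60 (base64 usually 'YII...').
    Debugging heuristic only.
    """
    scheme, _, b64 = authorization_header.partition(" ")
    raw = base64.b64decode(b64)
    if raw.startswith(b"NTLMSSP\x00"):
        return "NTLM"
    if raw[:1] == b"\x60":
        return "Kerberos/SPNEGO"
    return "unknown"
```

If the base64 blob in the browser’s request starts with "TlRMTVNT", it is NTLM, which matches the module’s warning above.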

Don’t forget to set LogLevel back after you have finished, because the log file becomes really large fast…

hpacucli error: no controllers detected.

I wanted to check the P400 controller inside an HP DL180 G5 ProLiant server which was running x64 Linux with a 2.6 kernel. HP has a utility named hpacucli designed for this kind of task, so I looked for an appropriate download link. After finding some versions (I never understood the logic behind HP’s software version numbering) and downloading them, I realized that none of them could see my P400 controller:

root@xerxes:~/opt/compaq/hpacucli/bld# ./.hpacucli ctrl all show

Error: No controllers detected.

The Internet is full of solutions to this problem:
– Use the uname26 utility to hide your 3.x kernel! But I have 2.6…
– modprobe sg before hpacucli! sg was compiled into the kernel already…
– etc…

After some hours of trials I straced the hpacucli command and found that it tries to open its libcpqimgr library, which was placed in the same directory but not in LD_LIBRARY_PATH, so it didn’t find it. So “No controllers detected” actually means “Hey, I didn’t find my right hand, which I use for detecting controllers!”. Nice.

A little modification to the command line:

root@xerxes:~/opt/compaq/hpacucli/bld# LD_LIBRARY_PATH=. ./.hpacucli ctrl all show

Smart Array P400 in Slot 6                (sn: PAFGL0M9VWK002)

I understand that if I had installed the package and used the appropriate script provided by HP, I would never have met this situation. So this is an unexpected thing. But maybe others run into the same, so remember: strace is (one of) your best (non-human) friends!
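The fix above can be wrapped in a tiny launcher that prepends the binary’s own directory to LD_LIBRARY_PATH before starting it. A minimal sketch (the paths are examples, not HP’s install layout):

```python
import os
import subprocess

def local_lib_env(binary_path):
    """Build an environment where LD_LIBRARY_PATH includes the binary's own
    directory, so shared libraries shipped next to the executable are found."""
    env = dict(os.environ)
    lib_dir = os.path.dirname(os.path.abspath(binary_path))
    existing = env.get("LD_LIBRARY_PATH", "")
    env["LD_LIBRARY_PATH"] = lib_dir + (os.pathsep + existing if existing else "")
    return env

# Hypothetical usage, equivalent to LD_LIBRARY_PATH=. ./hpacucli ctrl all show:
# subprocess.run(["./hpacucli", "ctrl", "all", "show"],
#                env=local_lib_env("./hpacucli"))
```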

CLR20r3 FileLoadException and Why Keep Settings from .config?

Today one of my colleagues tried to start a newer version of one of our .NET tools on her Win7 computer; it had just been copied from the deployment share.

The app didn’t start; only Windows Error Reporting was doing something in the systray. In the event log there was an error 22 with CLR20r3, mentioning a FileLoadException. P4’s value: PresentationFramework. A quick check of installed frameworks, etc.: everything seemed fine. Nothing useful was found in the generated WER file either.

The tool ran well on other machines. What happened on hers?

In the app’s .config file there were custom settings in the appSettings section; that’s why we kept the original file from the previous installation. The problem was that the .config also contained an assemblyBinding section with a bindingRedirect:

    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
        <dependentAssembly>
            <assemblyIdentity name="AnAssembly" publicKeyToken="123123123123" culture="neutral" />
            <bindingRedirect oldVersion="" newVersion="1.0.33334.0" />
        </dependentAssembly>
    </assemblyBinding>

The new build had a newer version of AnAssembly, which wasn’t used because of the bindingRedirect above! The new .config had the updated version numbers, but we overwrote it with the previous version of the file because we wanted to keep the correct appSettings values. It is very handy to use the appSettings section in the .config, but it is a bad idea, because framework configuration is kept in the same place. Keeping them separate seems a better idea.
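An alternative to overwriting the whole file is to merge: ship the new .config and copy only the appSettings values over from the old one, leaving the new bindingRedirects intact. A minimal sketch of that merge (the element names come from the standard .config schema; the merge policy itself is my assumption, not what we did):

```python
import xml.etree.ElementTree as ET

def merge_app_settings(old_config_xml, new_config_xml):
    """Copy appSettings values from the old config into the new one,
    leaving the new file's runtime/assemblyBinding sections untouched."""
    old_root = ET.fromstring(old_config_xml)
    new_root = ET.fromstring(new_config_xml)
    # key -> value pairs the user customized in the previous installation
    old_values = {e.get("key"): e.get("value")
                  for e in old_root.findall("./appSettings/add")}
    for e in new_root.findall("./appSettings/add"):
        key = e.get("key")
        if key in old_values:
            e.set("value", old_values[key])
    return ET.tostring(new_root, encoding="unicode")
```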

PS: What about PresentationFramework in the P4 value? Completely misleading info…

Jenkins: Publish Over CIFS plugin

If you run Jenkins on a Windows machine, I may save you some hours.

I wanted to copy some files from my job’s workspace to a network share which required authentication, so XCOPY wasn’t enough and I looked for another solution. The Publish Over CIFS plugin can handle this situation, so I installed and configured it and added a new “Send build artifact to windows share” post-build step to my job.

After a quick build I found these lines at the end of the build’s console output:

CIFS: Connecting from host [atlas]
CIFS: Connecting with configuration [fileserver DEPLOY share] ...
CIFS: Removing WINS from name resolution
CIFS: Setting response timeout [30 000]
CIFS: Setting socket timeout [35 000]
CIFS: cleaning [smb://fileserver.mecset.local/DEPLOY/ArdinTemplatingRedmineConnector/]
CIFS: Disconnecting configuration [fileserver DEPLOY share] ...
CIFS: Transferred 0 file(s)

0 files transferred? Naturally I had some result files, so I checked the paths, etc. No errors, but no files copied.

After a short two hours of trial and error I realized: even though it runs on Windows, the plugin IS CASE SENSITIVE! Grrrrr…
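The effect is easy to reproduce with any strictly case-sensitive glob matcher, for example Python’s fnmatchcase (the file names below are made up, not my actual patterns):

```python
from fnmatch import fnmatchcase

# A case-sensitive glob matches only when the case agrees exactly,
# even though the underlying Windows file system would not care.
assert fnmatchcase("output/report.txt", "output/*.txt")        # matches
assert not fnmatchcase("Output/Report.TXT", "output/*.txt")    # 0 files "transferred"
```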

My source files were set to:


After changing it to:


everything worked fine.

Nothing was written on the plugin’s wiki pages about this, so if some search engine brought you here, I hope it helps.

LINQ to Object: Skip() performance

Assume you have a List of 5,000,000 items. You want to do some paging, so you need e.g. 100 items from offset 300,000. Check these two implementations:

        public List<T> GetItems1<T>(List<T> source, int offset, int count)
        {
            List<T> ret = new List<T>();

            for (int i = offset; i < offset + count; i++)
            {
                ret.Add(source[i]);
            }

            return ret;
        }

        public List<T> GetItems2<T>(List<T> source, int offset, int count)
        {
            List<T> ret = source.Skip(offset).Take(count).ToList();

            return ret;
        }

What do you think, which performs better? You may say the second one, of course. The first one indexes into the source list and calls Add() count times. The second simply enumerates once up to offset, then returns count items as a new List, with the possibility of internal optimization when adding items. At least that was what I thought.

But I was wrong!

The second implementation is always slower. The magnitude depends on the offset value, but it is always slower!

offset     GetItems1 (avg ticks)   GetItems2 (avg ticks)
0                 43                      65
10000             59                     729
100000            44                    5162
1000000           42                   52057
3000000           44                  147608

The reason is in the implementation details of List and IEnumerable’s Skip(). The first one knows where to find the nth item; the second one has to enumerate up to it. The conclusion, as one of my colleagues pointed out: use tools as specialized as you can.
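The same trade-off exists outside .NET. In Python, itertools.islice plays the role of Skip(): it must consume offset elements one by one, while a direct slice uses the list’s O(1) indexing. A sketch of the analogue (the numbers are illustrative, not the original measurement):

```python
from itertools import islice

def get_items_indexed(source, offset, count):
    """Direct indexing: the list knows where the nth item lives."""
    return source[offset:offset + count]

def get_items_enumerated(source, offset, count):
    """Skip/Take style: the iterator walks past `offset` items first,
    so cost grows linearly with the offset."""
    return list(islice(source, offset, offset + count))

data = list(range(5_000_000))
# Both return the same page; only the cost differs.
assert get_items_indexed(data, 300_000, 100) == get_items_enumerated(data, 300_000, 100)
```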

The code I used for the results above:

    public class Test
    {
        private int cycles = 100;
        private int offset = 0;
        private int count = 100;

        public void perftest1()
        {
            var l = GetTestData();

            var sw = new Stopwatch();

            double r = 0;
            for (int i = 0; i < cycles; i++)
            {
                sw.Restart();
                var result = GetItems1(l, offset, count);
                sw.Stop();
                r += sw.ElapsedTicks;
            }

            Assert.Fail((r / cycles).ToString());
        }

        public void perftest2()
        {
            var l = GetTestData();

            var sw = new Stopwatch();

            double r = 0;
            for (int i = 0; i < cycles; i++)
            {
                sw.Restart();
                var result = GetItems2(l, offset, count);
                sw.Stop();
                r += sw.ElapsedTicks;
            }

            Assert.Fail((r / cycles).ToString());
        }

        public List<T> GetItems1<T>(List<T> source, int offset, int count)
        {
            List<T> ret = new List<T>();

            for (int i = offset; i < offset + count; i++)
            {
                ret.Add(source[i]);
            }

            return ret;
        }

        public List<T> GetItems2<T>(List<T> source, int offset, int count)
        {
            List<T> ret = source.Skip(offset).Take(count).ToList();

            return ret;
        }

        private List<int> GetTestData()
        {
            List<int> ret = new List<int>();

            for (int i = 0; i < 5000000; i++)
            {
                ret.Add(i);
            }

            return ret;
        }
    }

Redmine, awesome_nested_set, Issue.rebuild!, lock_version and ActiveRecord::StaleObjectError

Today I got a task to restructure about 400 issues in our Redmine instance: create a hierarchy and reparent existing issues under new ones. All the issues have a lot of relations. But when I tried to modify the parent issue field, I got a “Parent task is invalid” message after saving. I found pages about this problem, and the only solution offered was: remove the relations, reparent the issue, and recreate the relations… 400 issues, each with a lot of relations… No.

So I updated the database instead: I set issues.parent_id to the appropriate new value. Naturally, after that two things went wrong: some aggregated values, e.g. estimated_hours, and the hierarchy view, because the awesome_nested_set gem is used, which stores hierarchy speedup values in the rgt and lft fields of that table. First of all I cleared these fields in the database (11840 was my main root issue):

UPDATE issues SET lft = NULL, rgt = NULL
WHERE parent_id = 11840;

Then the following command needs to be run in the Redmine root dir:

RAILS_ENV=production ruby script/rails runner Issue.rebuild!

After a long run, an error occurred:

ActiveRecord::StaleObjectError: ActiveRecord::StaleObjectError
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/locking/optimistic.rb:90:in `update'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/attribute_methods/dirty.rb:74:in `update'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/timestamp.rb:71:in `update'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/callbacks.rb:272:in `update'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:403:in `_run__1446014408__update__4__callbacks'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:405:in `send'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:405:in `__run_callback'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:385:in `_run_update_callbacks'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:81:in `send'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:81:in `run_callbacks'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/callbacks.rb:272:in `update'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/persistence.rb:348:in `create_or_update'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/callbacks.rb:264:in `create_or_update'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:590:in `_run__1446014408__save__4__callbacks'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:405:in `send'
from /var/lib/gems/1.8/gems/activesupport-3.2.19/lib/active_support/callbacks.rb:405:in `__run_callback'
... 18 levels...
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/relation/delegation.rb:6:in `__send__'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/relation/delegation.rb:6:in `each'
from /var/lib/gems/1.8/gems/awesome_nested_set-2.1.6/lib/awesome_nested_set/awesome_nested_set.rb:203:in `rebuild!'
from /var/lib/gems/1.8/gems/awesome_nested_set-2.1.6/lib/awesome_nested_set/awesome_nested_set.rb:213:in `call'
from /var/lib/gems/1.8/gems/awesome_nested_set-2.1.6/lib/awesome_nested_set/awesome_nested_set.rb:213:in `rebuild!'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/relation/delegation.rb:6:in `each'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/relation/delegation.rb:6:in `__send__'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/relation/delegation.rb:6:in `each'
from /var/lib/gems/1.8/gems/awesome_nested_set-2.1.6/lib/awesome_nested_set/awesome_nested_set.rb:210:in `rebuild!'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/scoping/default.rb:41:in `unscoped'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/relation.rb:241:in `scoping'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/scoping.rb:98:in `with_scope'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/relation.rb:241:in `scoping'
from /var/lib/gems/1.8/gems/activerecord-3.2.19/lib/active_record/scoping/default.rb:41:in `unscoped'
from /var/lib/gems/1.8/gems/awesome_nested_set-2.1.6/lib/awesome_nested_set/awesome_nested_set.rb:185:in `rebuild!'

After some googling I modified my approach and used the console instead:

RAILS_ENV=production ruby script/rails console
irb(main):003:0> Issue.rebuild!

Same error, but instead of a long nothing-happens block, a lot of SQL statements scrolled over my screen, indicating what was happening. Here are the interesting last ones:

   (0.2ms)  UPDATE `issues` SET `lock_version` = 53, `estimated_hours` = NULL, `updated_on` = '2014-11-25 12:29:00' WHERE (`issues`.`id` = 12751 AND `issues`.`lock_version` = 52)
   (0.1ms)  UPDATE `issues` SET `lock_version` = 53, `lft` = 388, `updated_on` = '2014-11-25 12:29:02', `rgt` = 419 WHERE (`issues`.`id` = 12751 AND `issues`.`lock_version` = 52)
   (0.1ms)  ROLLBACK

So it seems that the ActiveRecord contexts used by Redmine and by the nested set’s rebuilder differ: Redmine collects info about aggregated values and updates the record, while the rebuilder wants to update the hierarchy fields. Optimistic locking fails here.
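The failure mode in the log above is ordinary optimistic locking: both writers read lock_version 52, the first UPDATE bumps it to 53, and the second UPDATE’s WHERE lock_version = 52 then matches zero rows, so the transaction rolls back. A toy sketch of the mechanism (not Redmine code; the field values echo the log):

```python
class StaleObjectError(Exception):
    pass

def optimistic_update(row, expected_version, changes):
    """Apply changes only if the row still has the version we read,
    mirroring 'UPDATE ... WHERE id = ? AND lock_version = ?'."""
    if row["lock_version"] != expected_version:
        raise StaleObjectError("row was modified by someone else")
    row.update(changes)
    row["lock_version"] += 1

issue = {"id": 12751, "lock_version": 52, "estimated_hours": 10, "lft": 1, "rgt": 2}
optimistic_update(issue, 52, {"estimated_hours": None})       # succeeds, version -> 53
try:
    optimistic_update(issue, 52, {"lft": 388, "rgt": 419})    # second writer still holds 52
except StaleObjectError:
    pass  # this is the ROLLBACK seen in the SQL log
```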

My solution: stop Redmine and run this command in the Rails console before Issue.rebuild!:

irb(main):003:0> ActiveRecord::Base.lock_optimistically = false

This turns off optimistic locking in your process, so the script will run. It seems the updated fields do not overlap, so running without locking is not a problem here.

NuGet silently overwrites some files

On package update, NuGet silently overwrites some files. Consider you have something like this in your nuspec file:

    <file src="MyPackage\App.xaml.cs.pp" target="content\App.xaml.cs.pp" />
    <file src="MyPackage\App.xaml.pp" target="content\App.xaml.pp" />

Consider the user changed App.xaml.cs but didn’t change App.xaml. In this case, when the user updates your package, the App.xaml.cs file will be silently overwritten!

After a short search session on the net, I found the same problem reported without any solution. So here is my install.ps1 with a workaround:

param($installPath, $toolsPath, $package, $project)

$item = $project.ProjectItems.Item("App.xaml")
$appXamlFileName = $item.Properties.Item("FullPath").Value
Add-Content $appXamlFileName "<!-- this comment was added by the nuget package's install script to modify this file, to work around a bug which overwrites the App.xaml.cs file on the next package update if this one isn't modified -->"

On update, NuGet will detect that App.xaml changed and will skip it AND the App.xaml.cs file too!

How to debug a NuGet package’s install.ps1

While searching for a solution around install.ps1, I quickly realised that the edit file -> create package -> publish -> update target cycle isn’t very handy. There are step-by-step instructions on the net on how to debug install.ps1, but when I tried to use them I ran into new problems with incompatibilities between my packages and the NuGet PowerShell cmdlets.

So I used a poor man’s solution instead:

  1. Add some dummy content file to your package.
  2. Install your package.
  3. Edit that dummy file (each subsequent package install cycle will ask you about overwriting it, and that is what we need!).
  4. Open the Package Manager Console (PMC) in VS.
  5. Execute “Set-PsDebug -trace 2”. It will help you later.
  6. Open install.ps1 from the target project’s packages directory in some editor (I used PowerShell ISE).
  7. In PMC, run “update-package _yourpackagename_ -reinstall”. The installation will stop at the file-overwriting question.
  8. In the PS editor, edit and save install.ps1, but don’t close it.
  9. Back in PMC, answer the overwriting question (with ‘L’ or ‘N’ to keep the modified file).
  10. See the results.
  11. Go to 7.

Not too smart, and it doesn’t use any tools or libraries, but it works with an acceptable cycle time.

Setting project item’s BuildAction from NuGet package

I created a package for internal use which has an App.xaml file in its content. Naturally, I would like to find it in the target project with BuildAction set to “ApplicationDefinition”, but Visual Studio treats it as “Page”, because that is the default for the xaml extension.

I found a hopeful solution here promising a fast track. My first version of install.ps1 was this:

param($installPath, $toolsPath, $package, $project)

$item = $project.ProjectItems.Item("App.xaml")
$item.Properties.Item("BuildAction").Value = ???

The problem arose when I didn’t find the “ApplicationDefinition” value in the prjBuildAction enumeration on MSDN…

I found some similar examples on the net. Some of them had a comment with a question: what to do if someone wants to set a value not in the enumeration? None of those comments had an answer, which distressed me a bit.

Here I found a clue, so I tried to enumerate those undefined prjBuildAction values with this code:

param($installPath, $toolsPath, $package, $project)

Add-Type -AssemblyName 'Microsoft.Build.Engine'
$msbuildproject = new-object Microsoft.Build.BuildEngine.Project

[System.Collections.ArrayList]$buildActions = "None", "Compile", "Content", "EmbeddedResource"

$msbuildproject.ItemGroups | Where-Object { $_.Name -eq "AvailableItemName" } | Select-Object -Property "Include" | ForEach-Object {
  $act = $_
  $buildActions.Add($act.Include) | Out-Null
}

$item = $project.ProjectItems.Item("App.xaml")
$item.Properties.Item("BuildAction").Value = [int]$buildActions.IndexOf("ApplicationDefinition")

I don’t know why, but the enumeration of the ItemGroups didn’t work. When I did a

Write-Host ($msbuildproject.ItemGroups | Format-List | Out-String)

it showed me a nice list of BuildItems, but when I ran a Where-Object against it, it found nothing. The problem must be in the PS syntax or the object instances. I don’t maintain my PS knowledge, which is based on my .NET and Linux scripting practice combined with looking for snippets in code examples around the net. I simply don’t want to go more deeply into it, because I feel PS is something “created” and not “born”, if you understand what I mean.

I rewrote the script to collect the available BuildAction values as follows:

$msbuildproject.ItemGroups | ForEach-Object {   
    $ig = $_
    @($ig.GetEnumerator()) | ForEach-Object { 
        $i = $_
        if ($i.Name -eq "AvailableItemName")
        {
            $buildActions.Add($i.Include) | Out-Null
        }
    }
}

And voilà, I got a nice list of values in $buildActions:


My $idx became 5; I checked the value in Solution Explorer… and found “CodeAnalysisDictionary” there. No problem, it must be some 0/1-based indexing issue, so let’s set $idx+1, run, check: value okay! Let’s try another one just to be sure… again a bad value! Unfortunately, it seems the order of values collected with this algorithm isn’t as reliable as the link I mentioned above implies. Back to the start line.

While looking for solutions, I found something somewhere about the ProjectItem’s “ItemType” property, so I tried to play with it. And suddenly the sun rose, the sky turned blue, etc.:

param($installPath, $toolsPath, $package, $project)
$item = $project.ProjectItems.Item("App.xaml")
$item.Properties.Item("ItemType").Value = "ApplicationDefinition"

So simple and it works!

Error 286 The “BuildShadowTask” task failed unexpectedly. System.NullReferenceException: Object reference not set to an instance of an object.

One of my colleagues met the message above while building a solution which has a test project. Because the solution he found isn’t trivial, I decided to share it here too:

Refresh accessors in Test References or add if some of them are missing.

Using accessors is obsolete now, but if you have an ancient project with them which won’t compile because of this error, I hope you find this info useful.