From mboxrd@z Thu Jan  1 00:00:00 1970
Received: from pigeon.gentoo.org ([208.92.234.80] helo=lists.gentoo.org)
	by finch.gentoo.org with esmtp (Exim 4.60)
	(envelope-from <gentoo-commits+bounces-348098-garchives=archives.gentoo.org@lists.gentoo.org>)
	id 1QQ12r-0005u2-Vp
	for garchives@archives.gentoo.org; Fri, 27 May 2011 17:41:58 +0000
Received: from pigeon.gentoo.org (localhost [127.0.0.1])
	by pigeon.gentoo.org (Postfix) with SMTP id E2C331C153;
	Fri, 27 May 2011 17:41:49 +0000 (UTC)
Received: from smtp.gentoo.org (smtp.gentoo.org [140.211.166.183])
	by pigeon.gentoo.org (Postfix) with ESMTP id 6C9471C153
	for <gentoo-commits@lists.gentoo.org>; Fri, 27 May 2011 17:41:49 +0000 (UTC)
Received: from pelican.gentoo.org (unknown [66.219.59.40])
	(using TLSv1 with cipher ADH-CAMELLIA256-SHA (256/256 bits))
	(No client certificate requested)
	by smtp.gentoo.org (Postfix) with ESMTPS id EC1381B4001
	for <gentoo-commits@lists.gentoo.org>; Fri, 27 May 2011 17:41:48 +0000 (UTC)
Received: from localhost.localdomain (localhost [127.0.0.1])
	by pelican.gentoo.org (Postfix) with ESMTP id A5D878050A
	for <gentoo-commits@lists.gentoo.org>; Fri, 27 May 2011 17:41:47 +0000 (UTC)
From: "Fabian Groffen" <grobian@gentoo.org>
To: gentoo-commits@lists.gentoo.org
Content-type: text/plain; charset=UTF-8
Reply-To: gentoo-dev@lists.gentoo.org, "Fabian Groffen" <grobian@gentoo.org>
Message-ID: <abd5adceac4b5f95de66be7516d4f29f24f00d02.grobian@gentoo>
Subject: [gentoo-commits] proj/portage:prefix commit in: /
X-VCS-Repository: proj/portage
X-VCS-Committer: grobian
X-VCS-Committer-Name: Fabian Groffen
X-VCS-Revision: abd5adceac4b5f95de66be7516d4f29f24f00d02
Date: Fri, 27 May 2011 17:41:47 +0000 (UTC)
Precedence: bulk
List-Post: <mailto:gentoo-commits@lists.gentoo.org>
List-Help: <mailto:gentoo-commits+help@lists.gentoo.org>
List-Unsubscribe: <mailto:gentoo-commits+unsubscribe@lists.gentoo.org>
List-Subscribe: <mailto:gentoo-commits+subscribe@lists.gentoo.org>
List-Id: Gentoo Linux mail <gentoo-commits.gentoo.org>
X-BeenThere: gentoo-commits@lists.gentoo.org
Content-Transfer-Encoding: quoted-printable
X-Archives-Salt: 
X-Archives-Hash: 61922e1385fb0b2335b052497f4321f1

commit:     abd5adceac4b5f95de66be7516d4f29f24f00d02
Author:     Fabian Groffen <grobian <AT> gentoo <DOT> org>
AuthorDate: Fri May 27 17:39:37 2011 +0000
Commit:     Fabian Groffen <grobian <AT> gentoo <DOT> org>
CommitDate: Fri May 27 17:39:37 2011 +0000
URL:        http://git.overlays.gentoo.org/gitweb/?p=proj/portage.git;a=commit;h=abd5adce

Merge remote-tracking branch 'overlays-gentoo-org/master' into prefix

Ported changes to LinkageMapELF to the other LinkageMaps

Conflicts:
	bin/etc-update
	bin/glsa-check
	bin/regenworld
	pym/portage/dbapi/vartree.py
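
The change ported here is the _LibGraphNode constructor: a node no longer stats a (path, root) pair itself, it re-uses the key of an already-computed _ObjectKey. A minimal sketch of that pattern follows, with simplified stand-in classes (illustrative only; the real Portage classes in the diff below also handle merge encodings and per-platform key formats):

import os

class _ObjectKey(object):
	"""Stand-in: identifies a file by (device, inode), or realpath if missing."""
	def __init__(self, obj, root):
		# the single stat() (or realpath fallback) happens here, once
		abs_path = os.path.join(root, obj.lstrip(os.sep))
		try:
			st = os.stat(abs_path)
			self._key = (st.st_dev, st.st_ino)
		except OSError:
			self._key = os.path.realpath(abs_path)

class _LibGraphNode(_ObjectKey):
	"""Stand-in: graph node built from an existing key."""
	def __init__(self, key):
		# re-use key._key; no second filesystem access, so a file modified
		# between calls cannot yield an inconsistent node key
		self._key = key._key
		self.alt_paths = set()

# Call sites accordingly change from
#     node = LinkageMapMachO._LibGraphNode(path, root)
# to
#     node = LinkageMapMachO._LibGraphNode(linkmap._obj_key(path))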


 bin/ebuild.sh                                      |    7 +
 bin/etc-update                                     |    8 +-
 bin/glsa-check                                     |    4 +-
 bin/regenworld                                     |    6 +-
 bin/repoman                                        |    2 +-
 cnf/make.conf.sparc.diff                           |    2 +-
 doc/package/ebuild/eapi/4.docbook                  |    4 +-
 make.conf.txt                                      |  719 --------------------
 man/ebuild.5                                       |    4 +-
 man/emerge.1                                       |   15 +-
 man/make.conf.5                                    |    5 +-
 pym/_emerge/AsynchronousLock.py                    |   49 ++-
 pym/_emerge/AsynchronousTask.py                    |    5 +-
 pym/_emerge/Binpkg.py                              |    3 +-
 pym/_emerge/BinpkgFetcher.py                       |    7 +-
 pym/_emerge/Blocker.py                             |   12 +-
 pym/_emerge/BlockerDB.py                           |    9 +-
 pym/_emerge/DepPriority.py                         |    2 +-
 pym/_emerge/DepPrioritySatisfiedRange.py           |   31 +-
 pym/_emerge/EbuildBuild.py                         |    3 +-
 pym/_emerge/EbuildBuildDir.py                      |   29 +-
 pym/_emerge/EbuildMerge.py                         |   23 +-
 pym/_emerge/EbuildPhase.py                         |   34 +-
 pym/_emerge/FakeVartree.py                         |   21 +-
 pym/_emerge/Package.py                             |   59 ++-
 pym/_emerge/PackageUninstall.py                    |  102 +++-
 pym/_emerge/Scheduler.py                           |   24 +-
 pym/_emerge/Task.py                                |   31 +-
 pym/_emerge/actions.py                             |   30 +-
 pym/_emerge/depgraph.py                            |  617 ++++++++++++++----
 pym/_emerge/help.py                                |   16 +-
 pym/_emerge/main.py                                |   17 +-
 pym/_emerge/resolver/backtracking.py               |    7 +-
 pym/_emerge/resolver/output_helpers.py             |    4 +-
 pym/_emerge/unmerge.py                             |   74 ++-
 pym/portage/const.py                               |    2 +-
 pym/portage/cvstree.py                             |    6 +-
 pym/portage/dbapi/_MergeProcess.py                 |   43 +-
 pym/portage/dbapi/vartree.py                       |  315 ++++++---
 pym/portage/mail.py                                |   11 +-
 pym/portage/output.py                              |    4 +-
 pym/portage/package/ebuild/doebuild.py             |  121 ++--
 pym/portage/package/ebuild/getmaskingstatus.py     |    7 +-
 pym/portage/tests/ebuild/test_config.py            |    4 +-
 pym/portage/tests/locks/test_asynchronous_lock.py  |   95 +++-
 pym/portage/tests/resolver/ResolverPlayground.py   |   99 +++-
 pym/portage/tests/resolver/test_autounmask.py      |   51 ++-
 .../tests/resolver/test_circular_dependencies.py   |    3 +-
 pym/portage/tests/resolver/test_depth.py           |    8 +-
 pym/portage/tests/resolver/test_merge_order.py     |  386 +++++++++++
 pym/portage/tests/resolver/test_multirepo.py       |    3 +
 .../tests/resolver/test_old_dep_chain_display.py   |    2 +
 pym/portage/tests/resolver/test_simple.py          |    2 +-
 pym/portage/tests/resolver/test_slot_collisions.py |    3 +-
 pym/portage/update.py                              |    4 +-
 pym/portage/util/__init__.py                       |   29 +-
 pym/portage/util/_dyn_libs/LinkageMapELF.py        |   13 +-
 pym/portage/util/_dyn_libs/LinkageMapMachO.py      |   13 +-
 pym/portage/util/_dyn_libs/LinkageMapPeCoff.py     |   11 +-
 pym/portage/util/_dyn_libs/LinkageMapXCoff.py      |   11 +-
 pym/portage/util/digraph.py                        |   10 +-
 pym/portage/util/movefile.py                       |    5 +-
 pym/portage/xml/metadata.py                        |    4 +-
 63 files changed, 1948 insertions(+), 1302 deletions(-)

diff --cc bin/etc-update
index 5fbd345,2369f04..2054389
--- a/bin/etc-update
+++ b/bin/etc-update
@@@ -1,5 -1,5 +1,5 @@@
 -#!/bin/bash
 +#!@PORTAGE_BASH@
- # Copyright 1999-2007 Gentoo Foundation
+ # Copyright 1999-2011 Gentoo Foundation
  # Distributed under the terms of the GNU General Public License v2
  
  # Author Brandon Low <lostlogic@gentoo.org>
diff --cc bin/glsa-check
index 64209ab,2f2d555..4f50a1f
--- a/bin/glsa-check
+++ b/bin/glsa-check
@@@ -1,5 -1,5 +1,5 @@@
 -#!/usr/bin/python
 +#!@PREFIX_PORTAGE_PYTHON@
- # Copyright 2008-2009 Gentoo Foundation
+ # Copyright 2008-2011 Gentoo Foundation
  # Distributed under the terms of the GNU General Public License v2
  
  from __future__ import print_function
diff --cc bin/regenworld
index e0e9774,6b5af4c..9e0e291
--- a/bin/regenworld
+++ b/bin/regenworld
@@@ -1,5 -1,5 +1,5 @@@
 -#!/usr/bin/python
 +#!@PREFIX_PORTAGE_PYTHON@
- # Copyright 1999-2010 Gentoo Foundation
+ # Copyright 1999-2011 Gentoo Foundation
  # Distributed under the terms of the GNU General Public License v2
  
  from __future__ import print_function
diff --cc pym/portage/const.py
index 6057520,e91c009..00a53e4
--- a/pym/portage/const.py
+++ b/pym/portage/const.py
@@@ -132,10 -88,9 +132,10 @@@ EBUILD_PHASES            = ("pretend",
  SUPPORTED_FEATURES       = frozenset([
                             "assume-digests", "binpkg-logs", "buildpkg", "buildsyspkg", "candy",
                             "ccache", "chflags", "collision-protect", "compress-build-logs",
-                            "digest", "distcc", "distlocks", "ebuild-locks", "fakeroot",
+                            "digest", "distcc", "distcc-pump", "distlocks", "ebuild-locks", "fakeroot",
                             "fail-clean", "fixpackages", "force-mirror", "getbinpkg",
                             "installsources", "keeptemp", "keepwork", "fixlafiles", "lmirror",
 +                           "macossandbox", "macosprefixsandbox", "macosusersandbox",
                             "metadata-transfer", "mirror", "multilib-strict", "news",
                             "noauto", "noclean", "nodoc", "noinfo", "noman",
                             "nostrip", "notitles", "parallel-fetch", "parallel-install",
diff --cc pym/portage/dbapi/vartree.py
index 581300f,e742358..e0f0856
--- a/pym/portage/dbapi/vartree.py
+++ b/pym/portage/dbapi/vartree.py
@@@ -2347,7 -2386,7 +2407,7 @@@ class dblink(object)
  		def path_to_node(path):
  			node = path_node_map.get(path)
  			if node is None:
- 				node = linkmap._LibGraphNode(path, root)
 -				node = LinkageMap._LibGraphNode(linkmap._obj_key(path))
++				node = linkmap._LibGraphNode(linkmap._obj_key(path))
  				alt_path_node = lib_graph.get(node)
  				if alt_path_node is not None:
  					node =3D alt_path_node
@@@ -2512,15 -2552,7 +2573,15 @@@
  		def path_to_node(path):
  			node = path_node_map.get(path)
  			if node is None:
 -				node = LinkageMap._LibGraphNode(linkmap._obj_key(path))
 +				chost = self.settings.get('CHOST')
 +				if chost.find('darwin') >= 0:
- 					node = LinkageMapMachO._LibGraphNode(path, root)
++					node = LinkageMapMachO._LibGraphNode(linkmap._obj_key(path))
 +				elif chost.find('interix') >= 0 or chost.find('winnt') >= 0:
- 					node = LinkageMapPeCoff._LibGraphNode(path, root)
++					node = LinkageMapPeCoff._LibGraphNode(linkmap._obj_key(path))
 +				elif chost.find('aix') >= 0:
- 					node = LinkageMapXCoff._LibGraphNode(path, root)
++					node = LinkageMapXCoff._LibGraphNode(linkmap._obj_key(path))
 +				else:
- 					node = LinkageMap._LibGraphNode(path, root)
++					node = LinkageMap._LibGraphNode(linkmap._obj_key(path))
  				alt_path_node = lib_graph.get(node)
  				if alt_path_node is not None:
  					node =3D alt_path_node
diff --cc pym/portage/util/_dyn_libs/LinkageMapMachO.py
index cbdf6c2,fef75b6..7ed004a
--- a/pym/portage/util/_dyn_libs/LinkageMapMachO.py
+++ b/pym/portage/util/_dyn_libs/LinkageMapMachO.py
@@@ -59,7 -60,7 +59,7 @@@ class LinkageMapMachO(object)
  
  		"""Helper class used as _obj_properties keys for objects."""
  
- 		__slots__ = ("__weakref__", "_key")
 -		__slots__ = ("_key",)
++		__slots__ = ("_key")
  
  		def __init__(self, obj, root):
  			"""
diff --cc pym/portage/util/_dyn_libs/LinkageMapPeCoff.py
index c90947e,0000000..25e8a45
mode 100644,000000..100644
--- a/pym/portage/util/_dyn_libs/LinkageMapPeCoff.py
+++ b/pym/portage/util/_dyn_libs/LinkageMapPeCoff.py
@@@ -1,267 -1,0 +1,274 @@@
 +# Copyright 1998-2011 Gentoo Foundation
 +# Distributed under the terms of the GNU General Public License v2
 +
 +import errno
 +import logging
 +import subprocess
 +
 +import portage
 +from portage import _encodings
 +from portage import _os_merge
 +from portage import _unicode_decode
 +from portage import _unicode_encode
 +from portage.cache.mappings import slot_dict_class
 +from portage.exception import CommandNotFound
 +from portage.localization import _
 +from portage.util import getlibpaths
 +from portage.util import grabfile
 +from portage.util import normalize_path
 +from portage.util import writemsg_level
 +from portage.const import EPREFIX
 +from portage.util._dyn_libs.LinkageMapELF import LinkageMapELF
 +
 +class LinkageMapPeCoff(LinkageMapELF):
 +
 +	"""Models dynamic linker dependencies."""
 +
 +	# NEEDED.PECOFF.1 has effectively the _same_ format as NEEDED.ELF.2,
 +	# but we keep up the relation "scanelf" -> "NEEDED.ELF", "readpecoff" ->
 +	# "NEEDED.PECOFF", "scanmacho" -> "NEEDED.MACHO", etc. others will follow.
 +	_needed_aux_key = "NEEDED.PECOFF.1"
 +
 +	class _ObjectKey(LinkageMapELF._ObjectKey):
 +
 +		"""Helper class used as _obj_properties keys for objects."""
 +
 +		def _generate_object_key(self, obj, root):
 +			"""
 +			Generate object key for a given object. This is different from the
 +			Linux implementation, since some systems (e.g. interix) don't have
 +			"inodes", thus the inode field is always zero, or a random value,
 +			making it inappropriate for identifying a file... :)
 +
 +			@param object: path to a file
 +			@type object: string (example: '/usr/bin/bar')
 +			@rtype: 2-tuple of types (bool, string)
 +			@return:
 +				2-tuple of boolean indicating existence, and absolute path
 +			"""
 +
 +			os = _os_merge
 +
 +			try:
 +				_unicode_encode(obj,
 +					encoding=_encodings['merge'], errors='strict')
 +			except UnicodeEncodeError:
 +				# The package appears to have been merged with a
 +				# different value of sys.getfilesystemencoding(),
 +				# so fall back to utf_8 if appropriate.
 +				try:
 +					_unicode_encode(obj,
 +						encoding=_encodings['fs'], errors='strict')
 +				except UnicodeEncodeError:
 +					pass
 +				else:
 +					os = portage.os
 +
 +			abs_path = os.path.join(root, obj.lstrip(os.sep))
 +			try:
 +				object_stat = os.stat(abs_path)
 +			except OSError:
 +				return (False, os.path.realpath(abs_path))
 +			# On Interix, the inode field may always be zero, since the
 +			# filesystem (NTFS) has no inodes ...
 +			return (True, os.path.realpath(abs_path))
 +
 +		def file_exists(self):
 +			"""
 +			Determine if the file for this key exists on the filesystem.
 +
 +			@rtype: Boolean
 +			@return:
 +				1. True if the file exists.
 +				2. False if the file does not exist or is a broken symlink.
 +
 +			"""
 +			return self._key[0]
 +
 +	class _LibGraphNode(_ObjectKey):
 +		__slots__ = ("alt_paths",)
 +
- 		def __init__(self, obj, root):
- 			LinkageMapPeCoff._ObjectKey.__init__(self, obj, root)
++		def __init__(self, key):
++			"""
++			Create a _LibGraphNode from an existing _ObjectKey.
++			This re-uses the _key attribute in order to avoid repeating
++			any previous stat calls, which helps to avoid potential race
++			conditions due to inconsistent stat results when the
++			file system is being modified concurrently.
++			"""
++			self._key = key._key
 +			self.alt_paths = set()
 +
 +		def __str__(self):
 +			return str(sorted(self.alt_paths))
 +
 +	def rebuild(self, exclude_pkgs=None, include_file=None,
 +		preserve_paths=None):
 +		"""
 +		Raises CommandNotFound if there are preserved libs
 +		and the readpecoff binary is not available.
 +
 +		@param exclude_pkgs: A set of packages that should be excluded from
 +			the LinkageMap, since they are being unmerged and their NEEDED
 +			entries are therefore irrelevant and would only serve to corrupt
 +			the LinkageMap.
 +		@type exclude_pkgs: set
 +		@param include_file: The path of a file containing NEEDED entries for
 +			a package which does not exist in the vardbapi yet because it is
 +			currently being merged.
 +		@type include_file: String
 +		@param preserve_paths: Libraries preserved by a package instance that
 +			is currently being merged. They need to be explicitly passed to the
 +			LinkageMap, since they are not registered in the
 +			PreservedLibsRegistry yet.
 +		@type preserve_paths: set
 +		"""
 +
 +		os = _os_merge
 +		root = self._root
 +		root_len = len(root) - 1
 +		self._clear_cache()
 +		self._defpath.update(getlibpaths(self._root))
 +		libs = self._libs
 +		obj_properties = self._obj_properties
 +
 +		lines = []
 +
 +		# Data from include_file is processed first so that it
 +		# overrides any data from previously installed files.
 +		if include_file is not None:
 +			for line in grabfile(include_file):
 +				lines.append((include_file, line))
 +
 +		aux_keys = [self._needed_aux_key]
 +		can_lock = os.access(os.path.dirname(self._dbapi._dbroot), os.W_OK)
 +		if can_lock:
 +			self._dbapi.lock()
 +		try:
 +			for cpv in self._dbapi.cpv_all():
 +				if exclude_pkgs is not None and cpv in exclude_pkgs:
 +					continue
 +				needed_file = self._dbapi.getpath(cpv,
 +					filename=self._needed_aux_key)
 +				for line in self._dbapi.aux_get(cpv, aux_keys)[0].splitlines():
 +					lines.append((needed_file, line))
 +		finally:
 +			if can_lock:
 +				self._dbapi.unlock()
 +
 +		# have to call readpecoff for preserved libs here as they aren't
 +		# registered in NEEDED.PECOFF.1 files
 +		plibs = set()
 +		if preserve_paths is not None:
 +			plibs.update(preserve_paths)
 +		if self._dbapi._plib_registry and \
 +			self._dbapi._plib_registry.hasEntries():
 +			for cpv, items in \
 +				self._dbapi._plib_registry.getPreservedLibs().items():
 +				if exclude_pkgs is not None and cpv in exclude_pkgs:
 +					# These preserved libs will either be unmerged,
 +					# rendering them irrelevant, or they will be
 +					# preserved in the replacement package and are
 +					# already represented via the preserve_paths
 +					# parameter.
 +					continue
 +				plibs.update(items)
 +		if plibs:
 +			args = ["readpecoff", self._dbapi.settings.get('CHOST')]
 +			args.extend(os.path.join(root, x.lstrip("." + os.sep)) \
 +				for x in plibs)
 +			try:
 +				proc = subprocess.Popen(args, stdout=subprocess.PIPE)
 +			except EnvironmentError as e:
 +				if e.errno != errno.ENOENT:
 +					raise
 +				raise CommandNotFound(args[0])
 +			else:
 +				for l in proc.stdout:
 +					try:
 +						l = _unicode_decode(l,
 +							encoding=_encodings['content'], errors='strict')
 +					except UnicodeDecodeError:
 +						l = _unicode_decode(l,
 +							encoding=_encodings['content'], errors='replace')
 +						writemsg_level(_("\nError decoding characters " \
 +							"returned from readpecoff: %s\n\n") % (l,),
 +							level=logging.ERROR, noiselevel=-1)
 +					l = l[3:].rstrip("\n")
 +					if not l:
 +						continue
 +					fields = l.split(";")
 +					if len(fields) < 5:
 +						writemsg_level(_("\nWrong number of fields " \
 +							"returned from readpecoff: %s\n\n") % (l,),
 +							level=logging.ERROR, noiselevel=-1)
 +						continue
 +					fields[1] = fields[1][root_len:]
 +					plibs.discard(fields[1])
 +					lines.append(("readpecoff", ";".join(fields)))
 +				proc.wait()
 +
 +		if plibs:
 +			# Preserved libraries that did not appear in the scanelf output.
 +			# This is known to happen with statically linked libraries.
 +			# Generate dummy lines for these, so we can assume that every
 +			# preserved library has an entry in self._obj_properties. This
 +			# is important in order to prevent findConsumers from raising
 +			# an unwanted KeyError.
 +			for x in plibs:
 +				lines.append(("plibs", ";".join(['', x, '', '', ''])))
 +
 +		for location, l in lines:
 +			l = l.rstrip("\n")
 +			if not l:
 +				continue
 +			fields = l.split(";")
 +			if len(fields) < 5:
 +				writemsg_level(_("\nWrong number of fields " \
 +					"in %s: %s\n\n") % (location, l),
 +					level=logging.ERROR, noiselevel=-1)
 +				continue
 +			arch = fields[0]
 +			obj = fields[1]
 +			soname = fields[2]
 +			path = set([normalize_path(x) \
 +				for x in filter(None, fields[3].replace(
 +				"${ORIGIN}", os.path.dirname(obj)).replace(
 +				"$ORIGIN", os.path.dirname(obj)).split(":"))])
 +			needed = [x for x in fields[4].split(",") if x]
 +
 +			obj_key = self._obj_key(obj)
 +			indexed = True
 +			myprops = obj_properties.get(obj_key)
 +			if myprops is None:
 +				indexed = False
 +				myprops = (arch, needed, path, soname, set())
 +				obj_properties[obj_key] = myprops
 +			# All object paths are added into the obj_properties tuple.
 +			myprops[4].add(obj)
 +
 +			# Don't index the same file more than once since only one
 +			# set of data can be correct and therefore mixing data
 +			# may corrupt the index (include_file overrides previously
 +			# installed).
 +			if indexed:
 +				continue
 +
 +			arch_map = libs.get(arch)
 +			if arch_map is None:
 +				arch_map = {}
 +				libs[arch] = arch_map
 +			if soname:
 +				soname_map = arch_map.get(soname)
 +				if soname_map is None:
 +					soname_map = self._soname_map_class(
 +						providers=set(), consumers=set())
 +					arch_map[soname] = soname_map
 +				soname_map.providers.add(obj_key)
 +			for needed_soname in needed:
 +				soname_map = arch_map.get(needed_soname)
 +				if soname_map is None:
 +					soname_map = self._soname_map_class(
 +						providers=set(), consumers=set())
 +					arch_map[needed_soname] = soname_map
 +				soname_map.consumers.add(obj_key)
diff --cc pym/portage/util/_dyn_libs/LinkageMapXCoff.py
index 0e930fe,0000000..782cc54
mode 100644,000000..100644
--- a/pym/portage/util/_dyn_libs/LinkageMapXCoff.py
+++ b/pym/portage/util/_dyn_libs/LinkageMapXCoff.py
@@@ -1,319 -1,0 +1,326 @@@
 +# Copyright 1998-2011 Gentoo Foundation
 +# Distributed under the terms of the GNU General Public License v2
 +
 +import errno
 +import logging
 +import subprocess
 +
 +import portage
 +from portage import _encodings
 +from portage import _os_merge
 +from portage import _unicode_decode
 +from portage import _unicode_encode
 +from portage.cache.mappings import slot_dict_class
 +from portage.exception import CommandNotFound
 +from portage.localization import _
 +from portage.util import getlibpaths
 +from portage.util import grabfile
 +from portage.util import normalize_path
 +from portage.util import writemsg_level
 +from portage.const import EPREFIX, BASH_BINARY
 +from portage.util._dyn_libs.LinkageMapELF import LinkageMapELF
 +
 +class LinkageMapXCoff(LinkageMapELF):
 +
 +	"""Models dynamic linker dependencies."""
 +
 +	_needed_aux_key = "NEEDED.XCOFF.1"
 +
 +	class _ObjectKey(LinkageMapELF._ObjectKey):
 +
 +		def __init__(self, obj, root):
 +			LinkageMapELF._ObjectKey.__init__(self, obj, root)
 +
 +		def _generate_object_key(self, obj, root):
 +			"""
 +			Generate object key for a given object.
 +
 +			@param object: path to a file
 +			@type object: string (example: '/usr/bin/bar')
 +			@rtype: 2-tuple of types (long, int) if object exists. string if
 +				object does not exist.
 +			@return:
 +				1. 2-tuple of object's inode and device from a stat call, if object
 +					exists.
 +				2. realpath of object if object does not exist.
 +
 +			"""
 +
 +			os = _os_merge
 +
 +			try:
 +				_unicode_encode(obj,
 +					encoding=_encodings['merge'], errors='strict')
 +			except UnicodeEncodeError:
 +				# The package appears to have been merged with a
 +				# different value of sys.getfilesystemencoding(),
 +				# so fall back to utf_8 if appropriate.
 +				try:
 +					_unicode_encode(obj,
 +						encoding=_encodings['fs'], errors='strict')
 +				except UnicodeEncodeError:
 +					pass
 +				else:
 +					os = portage.os
 +
 +			abs_path = os.path.join(root, obj.lstrip(os.sep))
 +			try:
 +				object_stat = os.stat(abs_path)
 +			except OSError:
 +				# Use the realpath as the key if the file does not exist on the
 +				# filesystem.
 +				return os.path.realpath(abs_path)
 +			# Return a tuple of the device and inode, as well as the basename,
 +			# because of hardlinks the device and inode might be identical.
 +			return (object_stat.st_dev, object_stat.st_ino, os.path.basename(abs_path.rstrip(os.sep)))
 +
 +		def file_exists(self):
 +			"""
 +			Determine if the file for this key exists on the filesystem.
 +
 +			@rtype: Boolean
 +			@return:
 +				1. True if the file exists.
 +				2. False if the file does not exist or is a broken symlink.
 +
 +			"""
 +			return isinstance(self._key, tuple)
 +
 +	class _LibGraphNode(_ObjectKey):
 +		__slots__ = ("alt_paths",)
 +
- 		def __init__(self, obj, root):
- 			LinkageMapXCoff._ObjectKey.__init__(self, obj, root)
++		def __init__(self, key):
++			"""
++			Create a _LibGraphNode from an existing _ObjectKey.
++			This re-uses the _key attribute in order to avoid repeating
++			any previous stat calls, which helps to avoid potential race
++			conditions due to inconsistent stat results when the
++			file system is being modified concurrently.
++			"""
++			self._key = key._key
 +			self.alt_paths = set()
 +
 +		def __str__(self):
 +			return str(sorted(self.alt_paths))
 +
 +	def rebuild(self, exclude_pkgs=None, include_file=None,
 +		preserve_paths=None):
 +		"""
 +		Raises CommandNotFound if there are preserved libs
 +		and the scanelf binary is not available.
 +
 +		@param exclude_pkgs: A set of packages that should be excluded from
 +			the LinkageMap, since they are being unmerged and their NEEDED
 +			entries are therefore irrelevant and would only serve to corrupt
 +			the LinkageMap.
 +		@type exclude_pkgs: set
 +		@param include_file: The path of a file containing NEEDED entries for
 +			a package which does not exist in the vardbapi yet because it is
 +			currently being merged.
 +		@type include_file: String
 +		@param preserve_paths: Libraries preserved by a package instance that
 +			is currently being merged. They need to be explicitly passed to the
 +			LinkageMap, since they are not registered in the
 +			PreservedLibsRegistry yet.
 +		@type preserve_paths: set
 +		"""
 +
 +		os = _os_merge
 +		root = self._root
 +		root_len = len(root) - 1
 +		self._clear_cache()
 +		self._defpath.update(getlibpaths(self._root))
 +		libs = self._libs
 +		obj_properties = self._obj_properties
 +
 +		lines = []
 +
 +		# Data from include_file is processed first so that it
 +		# overrides any data from previously installed files.
 +		if include_file is not None:
 +			for line in grabfile(include_file):
 +				lines.append((include_file, line))
 +
 +		aux_keys = [self._needed_aux_key]
 +		can_lock = os.access(os.path.dirname(self._dbapi._dbroot), os.W_OK)
 +		if can_lock:
 +			self._dbapi.lock()
 +		try:
 +			for cpv in self._dbapi.cpv_all():
 +				if exclude_pkgs is not None and cpv in exclude_pkgs:
 +					continue
 +				needed_file = self._dbapi.getpath(cpv,
 +					filename=self._needed_aux_key)
 +				for line in self._dbapi.aux_get(cpv, aux_keys)[0].splitlines():
 +					lines.append((needed_file, line))
 +		finally:
 +			if can_lock:
 +				self._dbapi.unlock()
 +
 +		# have to call scanelf for preserved libs here as they aren't
 +		# registered in NEEDED.XCOFF.1 files
 +		plibs = set()
 +		if preserve_paths is not None:
 +			plibs.update(preserve_paths)
 +		if self._dbapi._plib_registry and \
 +			self._dbapi._plib_registry.hasEntries():
 +			for cpv, items in \
 +				self._dbapi._plib_registry.getPreservedLibs().items():
 +				if exclude_pkgs is not None and cpv in exclude_pkgs:
 +					# These preserved libs will either be unmerged,
 +					# rendering them irrelevant, or they will be
 +					# preserved in the replacement package and are
 +					# already represented via the preserve_paths
 +					# parameter.
 +					continue
 +				plibs.update(items)
 +		if plibs:
 +			for x in plibs:
 +				args = [BASH_BINARY, "-c", ':'
 +					+ '; member="' + x + '"'
 +					+ '; archive=${member}'
 +					+ '; if [[ ${member##*/} == .*"["*"]" ]]'
 +					+ '; then member=${member%/.*}/${member##*/.}'
 +						 + '; archive=${member%[*}'
 +					+ '; fi'
 +					+ '; member=${member#${archive}}'
 +					+ '; [[ -r ${archive} ]] || chmod a+r "${archive}"'
 +					+ '; eval $(aixdll-query "${archive}${member}" FILE MEMBER FLAGS FORMAT RUNPATH DEPLIBS)'
 +					+ '; [[ -n ${member} ]] && needed=${FILE##*/} || needed='
 +					+ '; for deplib in ${DEPLIBS}'
 +					+ '; do eval deplib=${deplib}'
 +					   + '; if [[ ${deplib} != "." && ${deplib} != ".." ]]'
 +					   + '; then needed="${needed}${needed:+,}${deplib}"'
 +					   + '; fi'
 +					+ '; done'
 +					+ '; [[ -n ${MEMBER} ]] && MEMBER="[${MEMBER}]"'
 +					+ '; [[ " ${FLAGS} " == *" SHROBJ "* ]] && soname=${FILE##*/}${MEMBER} || soname='
 +					+ '; echo "${FORMAT##* }${FORMAT%%-*};${FILE#${ROOT%/}}${MEMBER};${soname};${RUNPATH};${needed}"'
 +					+ '; [[ -z ${member} && -n ${MEMBER} ]] && echo "${FORMAT##* }${FORMAT%%-*};${FILE#${ROOT%/}};${FILE##*/};;"'
 +				]
 +			try:
 +				proc = subprocess.Popen(args, stdout=subprocess.PIPE)
 +			except EnvironmentError as e:
 +				if e.errno != errno.ENOENT:
 +					raise
 +				raise CommandNotFound(args[0])
 +			else:
 +				for l in proc.stdout:
 +					try:
 +						l = _unicode_decode(l,
 +							encoding=_encodings['content'], errors='strict')
 +					except UnicodeDecodeError:
 +						l = _unicode_decode(l,
 +							encoding=_encodings['content'], errors='replace')
 +						writemsg_level(_("\nError decoding characters " \
 +							"returned from aixdll-query: %s\n\n") % (l,),
 +							level=logging.ERROR, noiselevel=-1)
 +					l = l.rstrip("\n")
 +					if not l:
 +						continue
 +					fields = l.split(";")
 +					if len(fields) < 5:
 +						writemsg_level(_("\nWrong number of fields " \
 +							"returned from aixdll-query: %s\n\n") % (l,),
 +							level=logging.ERROR, noiselevel=-1)
 +						continue
 +					fields[1] = fields[1][root_len:]
 +					plibs.discard(fields[1])
 +					lines.append(("aixdll-query", ";".join(fields)))
 +				proc.wait()
 +
 +		if plibs:
 +			# Preserved libraries that did not appear in the bash
 +			# aixdll-query code output.  This is known to happen with
 +			# statically linked libraries.  Generate dummy lines for
 +			# these, so we can assume that every preserved library has
 +			# an entry in self._obj_properties.  This is important in
 +			# order to prevent findConsumers from raising an unwanted
 +			# KeyError.
 +			for x in plibs:
 +				lines.append(("plibs", ";".join(['', x, '', '', ''])))
 +
 +		for location, l in lines:
 +			l = l.rstrip("\n")
 +			if not l:
 +				continue
 +			fields = l.split(";")
 +			if len(fields) < 5:
 +				writemsg_level(_("\nWrong number of fields " \
 +					"in %s: %s\n\n") % (location, l),
 +					level=logging.ERROR, noiselevel=-1)
 +				continue
 +			arch = fields[0]
 +
 +			def as_contentmember(obj):
 +				if obj.endswith("]"):
 +					if obj.find("/") >= 0:
 +						return obj[:obj.rfind("/")] + "/." + obj[obj.rfind("/")+1:]
 +					return "." + obj
 +				return obj
 +
 +			obj = as_contentmember(fields[1])
 +			soname = as_contentmember(fields[2])
 +			path = set([normalize_path(x) \
 +				for x in filter(None, fields[3].replace(
 +				"${ORIGIN}", os.path.dirname(obj)).replace(
 +				"$ORIGIN", os.path.dirname(obj)).split(":"))])
 +			needed = [as_contentmember(x) for x in fields[4].split(",") if x]
 +
 +			obj_key = self._obj_key(obj)
 +			indexed = True
 +			myprops = obj_properties.get(obj_key)
 +			if myprops is None:
 +				indexed = False
 +				myprops = (arch, needed, path, soname, set())
 +				obj_properties[obj_key] = myprops
 +			# All object paths are added into the obj_properties tuple.
 +			myprops[4].add(obj)
 +
 +			# Don't index the same file more than once since only one
 +			# set of data can be correct and therefore mixing data
 +			# may corrupt the index (include_file overrides previously
 +			# installed).
 +			if indexed:
 +				continue
 +
 +			arch_map = libs.get(arch)
 +			if arch_map is None:
 +				arch_map = {}
 +				libs[arch] = arch_map
 +			if soname:
 +				soname_map = arch_map.get(soname)
 +				if soname_map is None:
 +					soname_map = self._soname_map_class(
 +						providers=set(), consumers=set())
 +					arch_map[soname] = soname_map
 +				soname_map.providers.add(obj_key)
 +			for needed_soname in needed:
 +				soname_map = arch_map.get(needed_soname)
 +				if soname_map is None:
 +					soname_map = self._soname_map_class(
 +						providers=set(), consumers=set())
 +					arch_map[needed_soname] = soname_map
 +				soname_map.consumers.add(obj_key)
 +
 +	def getSoname(self, obj):
 +		"""
 +		Return the soname associated with an object.
 +
 +		@param obj: absolute path to an object
 +		@type obj: string (example: '/usr/bin/bar')
 +		@rtype: string
 +		@return: soname as a string
 +
 +		"""
 +		if not self._libs:
 +			self.rebuild()
 +		if isinstance(obj, self._ObjectKey):
 +			obj_key = obj
 +			if obj_key not in self._obj_properties:
 +				raise KeyError("%s not in object list" % obj_key)
 +			return self._obj_properties[obj_key][3]
 +		if obj not in self._obj_key_cache:
 +			raise KeyError("%s not in object list" % obj)
 +		return self._obj_properties[self._obj_key_cache[obj]][3]
 +