
Import upstream version 0.20.6, md5 e7f11a8e35db3fd16e848db2899c8ef9

Jelmer Vernooij 4 years ago
parent commit 9e6c9a77d6

+ 3 - 0
.github/workflows/pythonpackage.yml

@@ -20,6 +20,9 @@ jobs:
          # path encoding
          - os: windows-latest
            python-version: 3.5
+          # path encoding
+          - os: macos-latest
+            python-version: 3.5
      fail-fast: false
 
    steps:

+ 76 - 0
CODE_OF_CONDUCT.md

@@ -0,0 +1,76 @@
+# Contributor Covenant Code of Conduct
+
+## Our Pledge
+
+In the interest of fostering an open and welcoming environment, we as
+contributors and maintainers pledge to making participation in our project and
+our community a harassment-free experience for everyone, regardless of age, body
+size, disability, ethnicity, sex characteristics, gender identity and expression,
+level of experience, education, socio-economic status, nationality, personal
+appearance, race, religion, or sexual identity and orientation.
+
+## Our Standards
+
+Examples of behavior that contributes to creating a positive environment
+include:
+
+* Using welcoming and inclusive language
+* Being respectful of differing viewpoints and experiences
+* Gracefully accepting constructive criticism
+* Focusing on what is best for the community
+* Showing empathy towards other community members
+
+Examples of unacceptable behavior by participants include:
+
+* The use of sexualized language or imagery and unwelcome sexual attention or
+ advances
+* Trolling, insulting/derogatory comments, and personal or political attacks
+* Public or private harassment
+* Publishing others' private information, such as a physical or electronic
+ address, without explicit permission
+* Other conduct which could reasonably be considered inappropriate in a
+ professional setting
+
+## Our Responsibilities
+
+Project maintainers are responsible for clarifying the standards of acceptable
+behavior and are expected to take appropriate and fair corrective action in
+response to any instances of unacceptable behavior.
+
+Project maintainers have the right and responsibility to remove, edit, or
+reject comments, commits, code, wiki edits, issues, and other contributions
+that are not aligned to this Code of Conduct, or to ban temporarily or
+permanently any contributor for other behaviors that they deem inappropriate,
+threatening, offensive, or harmful.
+
+## Scope
+
+This Code of Conduct applies both within project spaces and in public spaces
+when an individual is representing the project or its community. Examples of
+representing a project or community include using an official project e-mail
+address, posting via an official social media account, or acting as an appointed
+representative at an online or offline event. Representation of a project may be
+further defined and clarified by project maintainers.
+
+## Enforcement
+
+Instances of abusive, harassing, or otherwise unacceptable behavior may be
+reported by contacting the project team at team@dulwich.io. All
+complaints will be reviewed and investigated and will result in a response that
+is deemed necessary and appropriate to the circumstances. The project team is
+obligated to maintain confidentiality with regard to the reporter of an incident.
+Further details of specific enforcement policies may be posted separately.
+
+Project maintainers who do not follow or enforce the Code of Conduct in good
+faith may face temporary or permanent repercussions as determined by other
+members of the project's leadership.
+
+## Attribution
+
+This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
+available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
+
+[homepage]: https://www.contributor-covenant.org
+
+For answers to common questions about this code of conduct, see
+https://www.contributor-covenant.org/faq

+ 14 - 0
NEWS

@@ -1,3 +1,17 @@
+0.20.6	2020-08-29
+
+ * Add a ``RefsContainer.watch`` interface.
+   (Jelmer Vernooij, #751)
+
+ * Fix pushing of new branches from porcelain.push.
+   (Jelmer Vernooij, #788)
+
+ * Honor shallows when pushing from a shallow clone.
+   (Jelmer Vernooij, #794)
+
+ * Fix porcelain.path_to_tree_path for Python 3.5.
+   (Boris Feld, #777)
+
 0.20.5	2020-06-22
 
  * Print a clearer exception when setup.py is executed on Python < 3.5.

+ 3 - 10
PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: dulwich
-Version: 0.20.5
+Version: 0.20.6
 Summary: Python Git Library
 Home-page: https://www.dulwich.io/
 Author: Jelmer Vernooij
@@ -9,15 +9,7 @@ License: Apachev2 or later or GPLv2
 Project-URL: Bug Tracker, https://github.com/dulwich/dulwich/issues
 Project-URL: Repository, https://www.dulwich.io/code/
 Project-URL: GitHub, https://github.com/dulwich/dulwich
-Description: .. image:: https://travis-ci.org/dulwich/dulwich.png?branch=master
-          :alt: Build Status
-          :target: https://travis-ci.org/dulwich/dulwich
-        
-        .. image:: https://ci.appveyor.com/api/projects/status/mob7g4vnrfvvoweb?svg=true
-          :alt: Windows Build Status
-          :target: https://ci.appveyor.com/project/jelmer/dulwich/branch/master
-        
-        This is the Dulwich project.
+Description: This is the Dulwich project.
        
        It aims to provide an interface to git repos (both local and remote) that
        doesn't call out to git directly but instead uses pure Python.
@@ -127,3 +119,4 @@ Requires-Python: >=3.5
 Provides-Extra: fastimport
 Provides-Extra: https
 Provides-Extra: pgp
+Provides-Extra: watch

+ 0 - 8
README.rst

@@ -1,11 +1,3 @@
-.. image:: https://travis-ci.org/dulwich/dulwich.png?branch=master
-  :alt: Build Status
-  :target: https://travis-ci.org/dulwich/dulwich
-
-.. image:: https://ci.appveyor.com/api/projects/status/mob7g4vnrfvvoweb?svg=true
-  :alt: Windows Build Status
-  :target: https://ci.appveyor.com/project/jelmer/dulwich/branch/master
-
 This is the Dulwich project.
 
 It aims to provide an interface to git repos (both local and remote) that

+ 3 - 10
dulwich.egg-info/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: dulwich
-Version: 0.20.5
+Version: 0.20.6
 Summary: Python Git Library
 Home-page: https://www.dulwich.io/
 Author: Jelmer Vernooij
@@ -9,15 +9,7 @@ License: Apachev2 or later or GPLv2
 Project-URL: Bug Tracker, https://github.com/dulwich/dulwich/issues
 Project-URL: Repository, https://www.dulwich.io/code/
 Project-URL: GitHub, https://github.com/dulwich/dulwich
-Description: .. image:: https://travis-ci.org/dulwich/dulwich.png?branch=master
-          :alt: Build Status
-          :target: https://travis-ci.org/dulwich/dulwich
-        
-        .. image:: https://ci.appveyor.com/api/projects/status/mob7g4vnrfvvoweb?svg=true
-          :alt: Windows Build Status
-          :target: https://ci.appveyor.com/project/jelmer/dulwich/branch/master
-        
-        This is the Dulwich project.
+Description: This is the Dulwich project.
        
        It aims to provide an interface to git repos (both local and remote) that
        doesn't call out to git directly but instead uses pure Python.
@@ -127,3 +119,4 @@ Requires-Python: >=3.5
 Provides-Extra: fastimport
 Provides-Extra: https
 Provides-Extra: pgp
+Provides-Extra: watch

+ 2 - 0
dulwich.egg-info/SOURCES.txt

@@ -3,6 +3,7 @@
 .mailmap
 .testr.conf
 AUTHORS
+CODE_OF_CONDUCT.md
 CONTRIBUTING.rst
 COPYING
 MANIFEST.in
@@ -16,6 +17,7 @@ dulwich.cfg
 requirements.txt
 setup.cfg
 setup.py
+status.yaml
 tox.ini
 .github/workflows/pythonpackage.yml
 .github/workflows/pythonpublish.yml

+ 3 - 0
dulwich.egg-info/requires.txt

@@ -9,3 +9,6 @@ urllib3[secure]>=1.24.1
 
 [pgp]
 gpg
+
+[watch]
+pyinotify

+ 1 - 1
dulwich/__init__.py

@@ -22,4 +22,4 @@
 
 """Python implementation of the Git file formats and protocols."""
 
-__version__ = (0, 20, 5)
+__version__ = (0, 20, 6)

+ 7 - 8
dulwich/client.py

@@ -773,14 +773,13 @@ def check_wants(wants, refs):
 
 def _remote_error_from_stderr(stderr):
     if stderr is None:
-        raise HangupException()
+        return HangupException()
     lines = [line.rstrip(b'\n') for line in stderr.readlines()]
     for line in lines:
         if line.startswith(b'ERROR: '):
-            raise GitProtocolError(
+            return GitProtocolError(
                 line[len(b'ERROR: '):].decode('utf-8', 'replace'))
-        raise GitProtocolError(line.decode('utf-8', 'replace'))
-    raise HangupException(lines)
+    return HangupException(lines)
 
 
 class TraditionalGitClient(GitClient):
@@ -832,7 +831,7 @@ class TraditionalGitClient(GitClient):
             try:
                 old_refs, server_capabilities = read_pkt_refs(proto)
             except HangupException:
-                _remote_error_from_stderr(stderr)
+                raise _remote_error_from_stderr(stderr)
             negotiated_capabilities, agent = \
                 self._negotiate_receive_pack_capabilities(server_capabilities)
             if CAPABILITY_REPORT_STATUS in negotiated_capabilities:
@@ -912,7 +911,7 @@ class TraditionalGitClient(GitClient):
             try:
                 refs, server_capabilities = read_pkt_refs(proto)
             except HangupException:
-                _remote_error_from_stderr(stderr)
+                raise _remote_error_from_stderr(stderr)
             negotiated_capabilities, symrefs, agent = (
                     self._negotiate_upload_pack_capabilities(
                             server_capabilities))
@@ -949,7 +948,7 @@ class TraditionalGitClient(GitClient):
             try:
                 refs, _ = read_pkt_refs(proto)
             except HangupException:
-                _remote_error_from_stderr(stderr)
+                raise _remote_error_from_stderr(stderr)
             proto.write_pkt_line(None)
             return refs
 
@@ -969,7 +968,7 @@ class TraditionalGitClient(GitClient):
             try:
                 pkt = proto.read_pkt_line()
             except HangupException:
-                _remote_error_from_stderr(stderr)
+                raise _remote_error_from_stderr(stderr)
             if pkt == b"NACK\n":
                 return
             elif pkt == b"ACK\n":

+ 13 - 6
dulwich/config.py

@@ -29,6 +29,8 @@ TODO:
 import os
 import sys
 
+from typing import BinaryIO, Tuple, Optional
+
 from collections import (
     OrderedDict,
     )
@@ -380,12 +382,17 @@ class ConfigFile(ConfigDict):
     """A Git configuration file, like .git/config or ~/.gitconfig.
     """
 
+    def __init__(self, values=None, encoding=None):
+        super(ConfigFile, self).__init__(values=values, encoding=encoding)
+        self.path = None
+
     @classmethod
-    def from_file(cls, f):
+    def from_file(cls, f: BinaryIO) -> 'ConfigFile':
         """Read configuration from a file-like object."""
         ret = cls()
-        section = None
+        section = None  # type: Optional[Tuple[bytes, ...]]
         setting = None
+        continuation = None
         for lineno, line in enumerate(f.readlines()):
             line = line.lstrip()
             if setting is None:
@@ -429,7 +436,7 @@ class ConfigFile(ConfigDict):
                     value = b"true"
                 setting = setting.strip()
                 if not _check_variable_name(setting):
-                    raise ValueError("invalid variable name %s" % setting)
+                    raise ValueError("invalid variable name %r" % setting)
                 if value.endswith(b"\\\n"):
                     continuation = value[:-2]
                 else:
@@ -449,21 +456,21 @@ class ConfigFile(ConfigDict):
         return ret
 
     @classmethod
-    def from_path(cls, path):
+    def from_path(cls, path) -> 'ConfigFile':
         """Read configuration from a file on disk."""
         with GitFile(path, 'rb') as f:
             ret = cls.from_file(f)
             ret.path = path
             return ret
 
-    def write_to_path(self, path=None):
+    def write_to_path(self, path=None) -> None:
         """Write configuration to a file on disk."""
         if path is None:
             path = self.path
         with GitFile(path, 'wb') as f:
             self.write_to_file(f)
 
-    def write_to_file(self, f):
+    def write_to_file(self, f: BinaryIO) -> None:
         """Write configuration to a file-like object."""
         for section, values in self._values.items():
             try:

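Besides the type annotations, the config.py hunk initializes `continuation = None` before the parse loop, so the backslash-continuation state always exists before it is consulted. A toy parser sketching that continuation handling (simplified, hypothetical names; the real `ConfigFile.from_file` also handles sections, quoting, and comments):

```python
def parse_simple_config(data: bytes) -> dict:
    """Toy key=value parser showing backslash line continuations,
    in the style of ConfigFile.from_file (heavily simplified)."""
    values = {}
    setting = None
    continuation = None  # initialised up front, as in the real parser
    for line in data.splitlines(keepends=True):
        line = line.lstrip()
        if setting is None:
            if b'=' not in line:
                continue
            setting, value = line.split(b'=', 1)
            setting = setting.strip()
            if value.endswith(b'\\\n'):
                continuation = value[:-2]  # start accumulating
            else:
                values[setting] = value.strip()
                setting = None
        else:
            if line.endswith(b'\\\n'):
                continuation += line[:-2]
            else:
                continuation += line.rstrip(b'\n')
                values[setting] = continuation.strip()
                setting = None
                continuation = None
    return values

print(parse_simple_config(b"key = first\\\nsecond\n"))
```

The continued value spans two physical lines but yields a single entry, `{b'key': b'firstsecond'}`.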
+ 8 - 0
dulwich/errors.py

@@ -117,6 +117,9 @@ class GitProtocolError(Exception):
     def __init__(self, *args, **kwargs):
         Exception.__init__(self, *args, **kwargs)
 
+    def __eq__(self, other):
+        return isinstance(self, type(other)) and self.args == other.args
+
 
 class SendPackError(GitProtocolError):
     """An error occurred during send_pack."""
@@ -147,6 +150,11 @@ class HangupException(GitProtocolError):
                 "The remote server unexpectedly closed the connection.")
         self.stderr_lines = stderr_lines
 
+    def __eq__(self, other):
+        return (
+            isinstance(self, type(other)) and
+            self.stderr_lines == other.stderr_lines)
+
 
 class UnexpectedCommandError(GitProtocolError):
     """Unexpected command received in a proto line."""

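The `__eq__` methods added in errors.py make error objects comparable by value, which is useful now that helpers like `_remote_error_from_stderr` build and return errors instead of raising them. A minimal sketch of the same pattern (simplified; the real class carries more behavior, and the upstream diff does not add `__hash__`):

```python
class GitProtocolError(Exception):
    # Value equality, as added in this commit: same type, same args.
    def __eq__(self, other):
        return isinstance(self, type(other)) and self.args == other.args
    # Keep instances hashable after overriding __eq__ (the upstream
    # diff omits this, which makes its instances unhashable).
    __hash__ = Exception.__hash__

print(GitProtocolError('denied') == GitProtocolError('denied'))  # True
print(GitProtocolError('denied') == GitProtocolError('other'))   # False
```

Note the `isinstance(self, type(other))` check is asymmetric: a subclass instance compares equal to a base-class instance only in one direction.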
+ 14 - 20
dulwich/graph.py

@@ -82,18 +82,15 @@ def _find_lcas(lookup_parents, c1, c2s):
     return results
 
 
-def find_merge_base(object_store, commit_ids):
+def find_merge_base(repo, commit_ids):
     """Find lowest common ancestors of commit_ids[0] and *any* of commits_ids[1:]
 
     Args:
-      object_store: object store
-      commit_ids:  list of commit ids
+      repo: Repository object
+      commit_ids: list of commit ids
     Returns:
       list of lowest common ancestor commit_ids
     """
-    def lookup_parents(commit_id):
-        return object_store[commit_id].parents
-
     if not commit_ids:
         return []
     c1 = commit_ids[0]
@@ -102,51 +99,48 @@ def find_merge_base(object_store, commit_ids):
     c2s = commit_ids[1:]
     if c1 in c2s:
         return [c1]
-    return _find_lcas(lookup_parents, c1, c2s)
+    parents_provider = repo.parents_provider()
+    return _find_lcas(parents_provider.get_parents, c1, c2s)
 
 
-def find_octopus_base(object_store, commit_ids):
+def find_octopus_base(repo, commit_ids):
     """Find lowest common ancestors of *all* provided commit_ids
 
     Args:
-      object_store: Object store
+      repo: Repository
       commit_ids:  list of commit ids
     Returns:
       list of lowest common ancestor commit_ids
     """
 
-    def lookup_parents(commit_id):
-        return object_store[commit_id].parents
-
     if not commit_ids:
         return []
     if len(commit_ids) <= 2:
-        return find_merge_base(object_store, commit_ids)
+        return find_merge_base(repo, commit_ids)
+    parents_provider = repo.parents_provider()
     lcas = [commit_ids[0]]
     others = commit_ids[1:]
     for cmt in others:
         next_lcas = []
         for ca in lcas:
-            res = _find_lcas(lookup_parents, cmt, [ca])
+            res = _find_lcas(parents_provider.get_parents, cmt, [ca])
             next_lcas.extend(res)
         lcas = next_lcas[:]
     return lcas
 
 
-def can_fast_forward(object_store, c1, c2):
+def can_fast_forward(repo, c1, c2):
     """Is it possible to fast-forward from c1 to c2?
 
     Args:
-      object_store: Store to retrieve objects from
+      repo: Repository to retrieve objects from
       c1: Commit id for first commit
       c2: Commit id for second commit
     """
     if c1 == c2:
         return True
 
-    def lookup_parents(commit_id):
-        return object_store[commit_id].parents
-
     # Algorithm: Find the common ancestor
-    lcas = _find_lcas(lookup_parents, c1, [c2])
+    parents_provider = repo.parents_provider()
+    lcas = _find_lcas(parents_provider.get_parents, c1, [c2])
     return lcas == [c1]

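The graph.py refactor routes all parent lookups through `repo.parents_provider().get_parents` instead of reading `object_store[commit_id].parents` directly, so any object exposing `get_parents` can plug in (e.g. a provider that honors shallow boundaries, matching the NEWS entry about pushing from shallow clones). A self-contained sketch with a hypothetical dict-backed provider and a simplified ancestor-based fast-forward check (dulwich's real `can_fast_forward` goes through `_find_lcas`):

```python
class DictParentsProvider:
    """Hypothetical stand-in for repo.parents_provider(): anything
    with a get_parents(commit_id) method fits the refactored API."""
    def __init__(self, parents):
        self._parents = parents
    def get_parents(self, commit_id):
        return self._parents.get(commit_id, [])

def ancestors(get_parents, start):
    # Walk the commit graph through the provider, collecting ancestors.
    seen, todo = set(), [start]
    while todo:
        c = todo.pop()
        if c not in seen:
            seen.add(c)
            todo.extend(get_parents(c))
    return seen

def can_fast_forward(provider, c1, c2):
    # c1..c2 is a fast-forward iff c1 is an ancestor of c2
    # (equivalent to dulwich's lcas == [c1] check).
    return c1 == c2 or c1 in ancestors(provider.get_parents, c2)

# Linear history: a <- b <- c
provider = DictParentsProvider({'c': ['b'], 'b': ['a']})
print(can_fast_forward(provider, 'a', 'c'))  # True
print(can_fast_forward(provider, 'c', 'a'))  # False
```

The same provider object serves `find_merge_base`, `find_octopus_base`, and `can_fast_forward`, which is why the three duplicate `lookup_parents` closures could be deleted.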
+ 53 - 36
dulwich/ignore.py

@@ -24,11 +24,23 @@ For details for the matching rules, see https://git-scm.com/docs/gitignore
 
 import os.path
 import re
+from typing import (
+    BinaryIO,
+    Iterable,
+    List,
+    Optional,
+    TYPE_CHECKING,
+    Dict,
+    Union,
+    )
 
-from dulwich.config import get_xdg_config_home_path
+if TYPE_CHECKING:
+    from dulwich.repo import Repo
 
+from dulwich.config import get_xdg_config_home_path, Config
 
-def _translate_segment(segment):
+
+def _translate_segment(segment: bytes) -> bytes:
     if segment == b"*":
         return b'[^/]+'
     res = b""
@@ -63,7 +75,7 @@ def _translate_segment(segment):
     return res
 
 
-def translate(pat):
+def translate(pat: bytes) -> bytes:
     """Translate a shell PATTERN to a regular expression.
 
     There is no way to quote meta-characters.
@@ -100,7 +112,7 @@ def translate(pat):
     return res + b'\\Z'
 
 
-def read_ignore_patterns(f):
+def read_ignore_patterns(f: BinaryIO) -> Iterable[bytes]:
     """Read a git ignore file.
 
     Args:
@@ -127,7 +139,8 @@ def read_ignore_patterns(f):
         yield line
 
 
-def match_pattern(path, pattern, ignorecase=False):
+def match_pattern(
+        path: bytes, pattern: bytes, ignorecase: bool = False) -> bool:
     """Match a gitignore-style pattern against a path.
 
     Args:
@@ -143,7 +156,7 @@ def match_pattern(path, pattern, ignorecase=False):
 class Pattern(object):
     """A single ignore pattern."""
 
-    def __init__(self, pattern, ignorecase=False):
+    def __init__(self, pattern: bytes, ignorecase: bool = False):
         self.pattern = pattern
         self.ignorecase = ignorecase
         if pattern[0:1] == b'!':
@@ -158,22 +171,22 @@ class Pattern(object):
             flags = re.IGNORECASE
         self._re = re.compile(translate(pattern), flags)
 
-    def __bytes__(self):
+    def __bytes__(self) -> bytes:
         return self.pattern
 
-    def __str__(self):
+    def __str__(self) -> str:
         return os.fsdecode(self.pattern)
 
-    def __eq__(self, other):
-        return (type(self) == type(other) and
+    def __eq__(self, other: object) -> bool:
+        return (isinstance(other, type(self)) and
                 self.pattern == other.pattern and
                 self.ignorecase == other.ignorecase)
 
-    def __repr__(self):
-        return "%s(%s, %r)" % (
+    def __repr__(self) -> str:
+        return "%s(%r, %r)" % (
             type(self).__name__, self.pattern, self.ignorecase)
 
-    def match(self, path):
+    def match(self, path: bytes) -> bool:
         """Try to match a path against this ignore pattern.
 
         Args:
@@ -185,23 +198,25 @@ class Pattern(object):
 
 class IgnoreFilter(object):
 
-    def __init__(self, patterns, ignorecase=False):
-        self._patterns = []
+    def __init__(self, patterns: Iterable[bytes], ignorecase: bool = False,
+                 path=None):
+        self._patterns = []  # type: List[Pattern]
         self._ignorecase = ignorecase
+        self._path = path
         for pattern in patterns:
             self.append_pattern(pattern)
 
-    def append_pattern(self, pattern):
+    def append_pattern(self, pattern: bytes) -> None:
         """Add a pattern to the set."""
         self._patterns.append(Pattern(pattern, self._ignorecase))
 
-    def find_matching(self, path):
+    def find_matching(self, path: Union[bytes, str]) -> Iterable[Pattern]:
         """Yield all matching patterns for path.
 
         Args:
           path: Path to match
         Returns:
-          Iterator over  iterators
+          Iterator over iterators
         """
         if not isinstance(path, bytes):
             path = os.fsencode(path)
@@ -209,7 +224,7 @@ class IgnoreFilter(object):
             if pattern.match(path):
                 yield pattern
 
-    def is_ignored(self, path):
+    def is_ignored(self, path: bytes) -> Optional[bool]:
         """Check whether a path is ignored.
 
         For directories, include a trailing slash.
@@ -223,17 +238,17 @@ class IgnoreFilter(object):
         return status
 
     @classmethod
-    def from_path(cls, path, ignorecase=False):
+    def from_path(cls, path, ignorecase: bool = False) -> 'IgnoreFilter':
         with open(path, 'rb') as f:
-            ret = cls(read_ignore_patterns(f), ignorecase)
-            ret._path = path
-            return ret
+            return cls(read_ignore_patterns(f), ignorecase, path=path)
 
-    def __repr__(self):
-        if getattr(self, '_path', None) is None:
-            return "<%s>" % (type(self).__name__)
+    def __repr__(self) -> str:
+        path = getattr(self, '_path', None)
+        if path is not None:
+            return "%s.from_path(%r)" % (
+                type(self).__name__, path)
         else:
-            return "%s.from_path(%r)" % (type(self).__name__, self._path)
+            return "<%s>" % (type(self).__name__)
 
 
 class IgnoreFilterStack(object):
@@ -242,7 +257,7 @@ class IgnoreFilterStack(object):
     def __init__(self, filters):
         self._filters = filters
 
-    def is_ignored(self, path):
+    def is_ignored(self, path: str) -> Optional[bool]:
         """Check whether a path is explicitly included or excluded in ignores.
 
         Args:
@@ -259,7 +274,7 @@ class IgnoreFilterStack(object):
         return status
 
 
-def default_user_ignore_filter_path(config):
+def default_user_ignore_filter_path(config: Config) -> str:
     """Return default user ignore filter path.
 
     Args:
@@ -278,19 +293,21 @@ def default_user_ignore_filter_path(config):
 class IgnoreFilterManager(object):
     """Ignore file manager."""
 
-    def __init__(self, top_path, global_filters, ignorecase):
-        self._path_filters = {}
+    def __init__(
+            self, top_path: str, global_filters: List[IgnoreFilter],
+            ignorecase: bool):
+        self._path_filters = {}  # type: Dict[str, Optional[IgnoreFilter]]
         self._top_path = top_path
         self._global_filters = global_filters
         self._ignorecase = ignorecase
 
-    def __repr__(self):
+    def __repr__(self) -> str:
         return "%s(%s, %r, %r)" % (
             type(self).__name__, self._top_path,
             self._global_filters,
             self._ignorecase)
 
-    def _load_path(self, path):
+    def _load_path(self, path: str) -> Optional[IgnoreFilter]:
         try:
             return self._path_filters[path]
         except KeyError:
@@ -304,7 +321,7 @@ class IgnoreFilterManager(object):
             self._path_filters[path] = None
         return self._path_filters[path]
 
-    def find_matching(self, path):
+    def find_matching(self, path: str) -> Iterable[Pattern]:
         """Find matching patterns for path.
 
         Stops after the first ignore file with matches.
@@ -336,7 +353,7 @@ class IgnoreFilterManager(object):
                 filters.insert(0, (i, ignore_filter))
         return iter([])
 
-    def is_ignored(self, path):
+    def is_ignored(self, path: str) -> Optional[bool]:
         """Check whether a path is explicitly included or excluded in ignores.
 
         Args:
@@ -351,7 +368,7 @@ class IgnoreFilterManager(object):
         return None
 
     @classmethod
-    def from_repo(cls, repo):
+    def from_repo(cls, repo: 'Repo') -> 'IgnoreFilterManager':
         """Create a IgnoreFilterManager from a repository.
 
         Args:

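The newly annotated `_translate_segment`/`translate` pair converts one gitignore path segment into a regular expression, with `*` matching anything except a `/`. A simplified sketch of the idea (handles only `*` and `?`; the real dulwich code also covers character classes, `**`, and anchoring):

```python
import re

def translate_segment(segment: bytes) -> bytes:
    """Simplified take on dulwich.ignore._translate_segment:
    glob segment -> regex fragment that never crosses a '/'."""
    if segment == b'*':
        return b'[^/]+'
    res = b''
    for i in range(len(segment)):
        ch = segment[i:i+1]
        if ch == b'*':
            res += b'[^/]*'   # any run of non-slash characters
        elif ch == b'?':
            res += b'[^/]'    # exactly one non-slash character
        else:
            res += re.escape(ch)
    return res

pattern = re.compile(translate_segment(b'*.pyc') + b'\\Z')
print(bool(pattern.match(b'foo.pyc')))      # True
print(bool(pattern.match(b'dir/foo.pyc')))  # False: '*' stops at '/'
```

Keeping `/` out of the wildcard classes is what makes a segment-wise translation compose correctly into full multi-segment patterns.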
+ 64 - 25
dulwich/index.py

@@ -25,6 +25,21 @@ import os
 import stat
 import struct
 import sys
+from typing import (
+    Any,
+    BinaryIO,
+    Callable,
+    Dict,
+    List,
+    Optional,
+    TYPE_CHECKING,
+    Iterable,
+    Iterator,
+    Tuple,
+    )
+
+if TYPE_CHECKING:
+    from dulwich.object_store import BaseObjectStore
 
 from dulwich.file import GitFile
 from dulwich.objects import (
@@ -52,6 +67,9 @@ FLAG_VALID = 0x8000
 FLAG_EXTENDED = 0x4000
 FLAG_EXTENDED = 0x4000
 
 
 
 
+DEFAULT_VERSION = 2
+
+
 def pathsplit(path):
 def pathsplit(path):
     """Split a /-delimited path into a directory part and a basename.
     """Split a /-delimited path into a directory part and a basename.
 
 
@@ -145,7 +163,7 @@ def write_cache_entry(f, entry):
     f.write(b'\0' * ((beginoffset + real_size) - f.tell()))
     f.write(b'\0' * ((beginoffset + real_size) - f.tell()))
 
 
 
 
-def read_index(f):
+def read_index(f: BinaryIO):
     """Read an index file, yielding the individual entries."""
     """Read an index file, yielding the individual entries."""
     header = f.read(4)
     header = f.read(4)
     if header != b'DIRC':
     if header != b'DIRC':
@@ -168,36 +186,45 @@ def read_index_dict(f):
     return ret
     return ret
 
 
 
 
-def write_index(f, entries):
+def write_index(
+        f: BinaryIO,
+        entries: List[Any], version: Optional[int] = None):
     """Write an index file.
     """Write an index file.
 
 
     Args:
     Args:
       f: File-like object to write to
       f: File-like object to write to
+      version: Version number to write
       entries: Iterable over the entries to write
       entries: Iterable over the entries to write
     """
     """
+    if version is None:
+        version = DEFAULT_VERSION
     f.write(b'DIRC')
     f.write(b'DIRC')
-    f.write(struct.pack(b'>LL', 2, len(entries)))
+    f.write(struct.pack(b'>LL', version, len(entries)))
     for x in entries:
     for x in entries:
         write_cache_entry(f, x)
         write_cache_entry(f, x)
 
 
 
 
-def write_index_dict(f, entries):
+def write_index_dict(
+        f: BinaryIO, entries: Dict[bytes, IndexEntry],
+        version: Optional[int] = None) -> None:
     """Write an index file based on the contents of a dictionary.
     """Write an index file based on the contents of a dictionary.
 
 
     """
     """
     entries_list = []
     entries_list = []
     for name in sorted(entries):
     for name in sorted(entries):
         entries_list.append((name,) + tuple(entries[name]))
         entries_list.append((name,) + tuple(entries[name]))
-    write_index(f, entries_list)
+    write_index(f, entries_list, version=version)
 
 
 
 
-def cleanup_mode(mode):
+def cleanup_mode(mode: int) -> int:
     """Cleanup a mode value.
     """Cleanup a mode value.
 
 
     This will return a mode that can be stored in a tree object.
     This will return a mode that can be stored in a tree object.
 
 
     Args:
     Args:
       mode: Mode to clean up.
       mode: Mode to clean up.
+    Returns:
+      mode
     """
     """
     if stat.S_ISLNK(mode):
     if stat.S_ISLNK(mode):
         return stat.S_IFLNK
         return stat.S_IFLNK
@@ -221,6 +248,8 @@ class Index(object):
           filename: Path to the index file
           filename: Path to the index file
         """
         """
         self._filename = filename
         self._filename = filename
+        # TODO(jelmer): Store the version returned by read_index
+        self._version = None
         self.clear()
         self.clear()
         self.read()
         self.read()
 
 
@@ -231,12 +260,12 @@ class Index(object):
     def __repr__(self):
     def __repr__(self):
         return "%s(%r)" % (self.__class__.__name__, self._filename)
         return "%s(%r)" % (self.__class__.__name__, self._filename)
 
 
-    def write(self):
+    def write(self) -> None:
         """Write current contents of index to disk."""
         """Write current contents of index to disk."""
         f = GitFile(self._filename, 'wb')
         f = GitFile(self._filename, 'wb')
         try:
         try:
             f = SHA1Writer(f)
             f = SHA1Writer(f)
-            write_index_dict(f, self._byname)
+            write_index_dict(f, self._byname, version=self._version)
         finally:
         finally:
             f.close()
             f.close()
 
 
@@ -255,11 +284,11 @@ class Index(object):
         finally:
         finally:
             f.close()
             f.close()
 
 
-    def __len__(self):
+    def __len__(self) -> int:
         """Number of entries in this index file."""
         """Number of entries in this index file."""
         return len(self._byname)
         return len(self._byname)
 
 
-    def __getitem__(self, name):
+    def __getitem__(self, name: bytes) -> IndexEntry:
         """Retrieve entry by relative path.
         """Retrieve entry by relative path.
 
 
         Returns: tuple with (ctime, mtime, dev, ino, mode, uid, gid, size, sha,
         Returns: tuple with (ctime, mtime, dev, ino, mode, uid, gid, size, sha,
@@ -267,19 +296,19 @@ class Index(object):
         """
         """
         return self._byname[name]
         return self._byname[name]
 
 
-    def __iter__(self):
+    def __iter__(self) -> Iterator[bytes]:
         """Iterate over the paths in this index."""
         """Iterate over the paths in this index."""
         return iter(self._byname)
         return iter(self._byname)
 
 
-    def get_sha1(self, path):
+    def get_sha1(self, path: bytes) -> bytes:
         """Return the (git object) SHA1 for the object at a path."""
         """Return the (git object) SHA1 for the object at a path."""
         return self[path].sha
         return self[path].sha
 
 
-    def get_mode(self, path):
+    def get_mode(self, path: bytes) -> int:
         """Return the POSIX file mode for the object at a path."""
         """Return the POSIX file mode for the object at a path."""
         return self[path].mode
         return self[path].mode
 
 
-    def iterobjects(self):
+    def iterobjects(self) -> Iterable[Tuple[bytes, bytes, int]]:
         """Iterate over path, sha, mode tuples for use with commit_tree."""
         """Iterate over path, sha, mode tuples for use with commit_tree."""
         for path in self:
         for path in self:
             entry = self[path]
             entry = self[path]
@@ -343,7 +372,9 @@ class Index(object):
         return commit_tree(object_store, self.iterobjects())
         return commit_tree(object_store, self.iterobjects())
 
 
 
 
-def commit_tree(object_store, blobs):
+def commit_tree(
+        object_store: 'BaseObjectStore',
+        blobs: Iterable[Tuple[bytes, bytes, int]]) -> bytes:
     """Commit a new tree.
     """Commit a new tree.
 
 
     Args:
     Args:
@@ -352,8 +383,7 @@ def commit_tree(object_store, blobs):
     Returns:
     Returns:
       SHA1 of the created tree.
       SHA1 of the created tree.
     """
     """
-
-    trees = {b'': {}}
+    trees = {b'': {}}  # type: Dict[bytes, Any]
 
 
     def add_tree(path):
     def add_tree(path):
         if path in trees:
         if path in trees:
@@ -385,7 +415,7 @@ def commit_tree(object_store, blobs):
     return build_tree(b'')
     return build_tree(b'')
 
 
 
 
-def commit_index(object_store, index):
+def commit_index(object_store: 'BaseObjectStore', index: Index) -> bytes:
     """Create a new tree from an index.
     """Create a new tree from an index.
 
 
     Args:
     Args:
@@ -397,8 +427,15 @@ def commit_index(object_store, index):
     return commit_tree(object_store, index.iterobjects())
     return commit_tree(object_store, index.iterobjects())
 
 
 
 
-def changes_from_tree(names, lookup_entry, object_store, tree,
-                      want_unchanged=False):
+def changes_from_tree(
+        names: Iterable[bytes],
+        lookup_entry: Callable[[bytes], Tuple[bytes, int]],
+        object_store: 'BaseObjectStore', tree: Optional[bytes],
+        want_unchanged=False) -> Iterable[
+            Tuple[
+                Tuple[Optional[bytes], Optional[bytes]],
+                Tuple[Optional[int], Optional[int]],
+                Tuple[Optional[bytes], Optional[bytes]]]]:
     """Find the differences between the contents of a tree and
     """Find the differences between the contents of a tree and
     a working copy.
     a working copy.
 
 
@@ -436,7 +473,9 @@ def changes_from_tree(names, lookup_entry, object_store, tree,
             yield ((None, name), (None, other_mode), (None, other_sha))
             yield ((None, name), (None, other_mode), (None, other_sha))
 
 
 
 
-def index_entry_from_stat(stat_val, hex_sha, flags, mode=None):
+def index_entry_from_stat(
+        stat_val, hex_sha: bytes, flags: int,
+        mode: Optional[int] = None):
     """Create a new index entry from a stat value.
     """Create a new index entry from a stat value.
 
 
     Args:
     Args:
@@ -659,7 +698,7 @@ def _has_directory_changed(tree_path, entry):
     return False
     return False
 
 
 
 
-def get_unstaged_changes(index, root_path, filter_blob_callback=None):
+def get_unstaged_changes(index: Index, root_path, filter_blob_callback=None):
     """Walk through an index and check for differences against working tree.
     """Walk through an index and check for differences against working tree.
 
 
     Args:
     Args:
@@ -699,7 +738,7 @@ def get_unstaged_changes(index, root_path, filter_blob_callback=None):
 os_sep_bytes = os.sep.encode('ascii')
 os_sep_bytes = os.sep.encode('ascii')
 
 
 
 
-def _tree_to_fs_path(root_path, tree_path):
+def _tree_to_fs_path(root_path, tree_path: bytes):
     """Convert a git tree path to a file system path.
     """Convert a git tree path to a file system path.
 
 
     Args:
     Args:
@@ -768,7 +807,8 @@ def index_entry_from_path(path, object_store=None):
     return None
     return None
 
 
 
 
-def iter_fresh_entries(paths, root_path, object_store=None):
+def iter_fresh_entries(
+        paths, root_path, object_store: Optional['BaseObjectStore'] = None):
     """Iterate over current versions of index entries on disk.
     """Iterate over current versions of index entries on disk.
 
 
     Args:
     Args:
@@ -814,7 +854,6 @@ def iter_fresh_objects(paths, root_path, include_deleted=False,
     """Iterate over versions of objecs on disk referenced by index.
     """Iterate over versions of objecs on disk referenced by index.
 
 
     Args:
     Args:
-      index: Index file
       root_path: Root path to access from
       root_path: Root path to access from
       include_deleted: Include deleted entries with sha and
       include_deleted: Include deleted entries with sha and
         mode set to None
         mode set to None
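
The `version` parameter threaded through `write_index` and `write_index_dict` above ends up in the 12-byte `DIRC` header, with `None` falling back to `DEFAULT_VERSION`. A stdlib-only sketch of just that header logic (`write_header` is an illustrative helper, not a dulwich function):

```python
import struct
from io import BytesIO

DEFAULT_VERSION = 2  # mirrors the constant added in dulwich/index.py


def write_header(f, num_entries, version=None):
    # Sketch of the fallback in write_index(): None means
    # DEFAULT_VERSION; otherwise the caller-supplied version is
    # packed big-endian after the b'DIRC' signature.
    if version is None:
        version = DEFAULT_VERSION
    f.write(b'DIRC')
    f.write(struct.pack(b'>LL', version, num_entries))


buf = BytesIO()
write_header(buf, 3)
assert buf.getvalue() == b'DIRC' + struct.pack(b'>LL', 2, 3)
```

Keeping the default in one module-level constant lets `Index.write` later round-trip whatever version `read_index` observed, as the TODO in the diff anticipates.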

+ 10 - 6
dulwich/pack.py

@@ -1922,20 +1922,24 @@ class Pack(object):
         self.resolve_ext_ref = resolve_ext_ref
 
     @classmethod
-    def from_lazy_objects(self, data_fn, idx_fn):
+    def from_lazy_objects(cls, data_fn, idx_fn):
         """Create a new pack object from callables to load pack data and
         index objects."""
-        ret = Pack('')
+        ret = cls('')
         ret._data_load = data_fn
         ret._idx_load = idx_fn
         return ret
 
     @classmethod
-    def from_objects(self, data, idx):
+    def from_objects(cls, data, idx):
         """Create a new pack object from pack data and index objects."""
-        ret = Pack('')
-        ret._data_load = lambda: data
-        ret._idx_load = lambda: idx
+        ret = cls('')
+        ret._data = data
+        ret._data.pack = ret
+        ret._data_load = None
+        ret._idx = idx
+        ret._idx_load = None
+        ret.check_length_and_checksum()
         return ret
 
     def name(self):
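
The `from_lazy_objects`/`from_objects` fix replaces a hard-coded `Pack('')` with `cls('')` (and correctly names the first classmethod argument `cls`), so subclasses built through these factories keep their own type. The general pattern, reduced to a stdlib-only sketch with a hypothetical `from_basename` factory:

```python
class Pack:
    # Simplified stand-in for dulwich's Pack; only the factory
    # pattern from the diff is shown.
    def __init__(self, basename):
        self._basename = basename

    @classmethod
    def from_basename(cls, basename):
        # Using cls instead of the class name means a subclass
        # calling this factory gets an instance of itself, not of
        # the base class -- the bug the diff fixes.
        return cls(basename)


class ThinPack(Pack):
    pass


assert type(ThinPack.from_basename('pack-1234')) is ThinPack
```

With the old `Pack('')` spelling, `ThinPack.from_basename(...)` would silently have returned a plain `Pack`.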

+ 64 - 22
dulwich/porcelain.py

@@ -219,29 +219,58 @@ def path_to_tree_path(repopath, path, tree_encoding=DEFAULT_ENCODING):
       path: A path, absolute or relative to the cwd
     Returns: A path formatted for use in e.g. an index
     """
-    path = Path(path).resolve()
-    repopath = Path(repopath).resolve()
-    relpath = path.relative_to(repopath)
-    if sys.platform == 'win32':
-        return str(relpath).replace(os.path.sep, '/').encode(tree_encoding)
+    # Pathlib resolve before Python 3.6 could raise FileNotFoundError in case
+    # there is no file matching the path, so we reuse the old implementation
+    # for Python 3.5
+    if sys.version_info < (3, 6):
+        if not isinstance(path, bytes):
+            path = os.fsencode(path)
+        if not isinstance(repopath, bytes):
+            repopath = os.fsencode(repopath)
+        treepath = os.path.relpath(path, repopath)
+        if treepath.startswith(b'..'):
+            err_msg = 'Path %r not in repo path (%r)' % (path, repopath)
+            raise ValueError(err_msg)
+        if os.path.sep != '/':
+            treepath = treepath.replace(os.path.sep.encode('ascii'), b'/')
+        return treepath
     else:
-        return bytes(relpath)
+        # Resolve might return a relative path on Windows
+        # https://bugs.python.org/issue38671
+        if sys.platform == 'win32':
+            path = os.path.abspath(path)
+
+        path = Path(path).resolve()
+
+        # Resolve and abspath seem to behave differently regarding symlinks;
+        # as we are doing abspath on the file path, we need to do the same on
+        # the repo path or they might not match
+        if sys.platform == 'win32':
+            repopath = os.path.abspath(repopath)
+
+        repopath = Path(repopath).resolve()
+
+        relpath = path.relative_to(repopath)
+        if sys.platform == 'win32':
+            return str(relpath).replace(os.path.sep, '/').encode(tree_encoding)
+        else:
+            return bytes(relpath)
 
 
 class DivergedBranches(Error):
     """Branches have diverged and fast-forward is not possible."""
 
 
-def check_diverged(store, current_sha, new_sha):
+def check_diverged(repo, current_sha, new_sha):
     """Check if updating to a sha can be done with fast forwarding.
 
     Args:
-      store: Object store
+      repo: Repository object
       current_sha: Current head sha
       new_sha: New head sha
     """
     try:
-        can = can_fast_forward(store, current_sha, new_sha)
+        can = can_fast_forward(repo, current_sha, new_sha)
     except KeyError:
         can = False
     if not can:
@@ -480,6 +509,13 @@ def clean(repo=".", target_dir=None):
         if not _is_subdir(target_dir, r.path):
             raise Error("target_dir must be in the repo's working dir")
 
+        config = r.get_config_stack()
+        require_force = config.get_boolean(   # noqa: F841
+            (b'clean',), b'requireForce', True)
+
+        # TODO(jelmer): if require_force is set, then make sure that -f, -i or
+        # -n is specified.
+
         index = r.open_index()
         ignore_manager = IgnoreFilterManager.from_repo(r)
 
@@ -928,7 +964,6 @@ def get_remote_repo(
         encoded_location = url
     else:
         remote_name = None
-        config = None
 
     return (remote_name, encoded_location.decode())
 
@@ -969,33 +1004,40 @@ def push(repo, remote_location=None, refspecs=None,
                     new_refs[rh] = ZERO_SHA
                     remote_changed_refs[rh] = None
                 else:
-                    if not force_ref:
-                        check_diverged(r.object_store, refs[rh], r.refs[lh])
-                    new_refs[rh] = r.refs[lh]
-                    remote_changed_refs[rh] = r.refs[lh]
+                    try:
+                        localsha = r.refs[lh]
+                    except KeyError:
+                        raise Error(
+                            'No valid ref %s in local repository' % lh)
+                    if not force_ref and rh in refs:
+                        check_diverged(r, refs[rh], localsha)
+                    new_refs[rh] = localsha
+                    remote_changed_refs[rh] = localsha
             return new_refs
 
         err_encoding = getattr(errstream, 'encoding', None) or DEFAULT_ENCODING
-        remote_location_bytes = client.get_url(path).encode(err_encoding)
+        remote_location = client.get_url(path)
        try:
             result = client.send_pack(
                 path, update_refs,
                 generate_pack_data=r.generate_pack_data,
                 progress=errstream.write)
-            errstream.write(
-                b"Push to " + remote_location_bytes + b" successful.\n")
         except SendPackError as e:
             raise Error(
-                "Push to " + remote_location_bytes +
+                "Push to " + remote_location +
                 " failed -> " + e.args[0].decode(), inner=e)
+        else:
+            errstream.write(
+                b"Push to " +
+                remote_location.encode(err_encoding) + b" successful.\n")
 
         for ref, error in (result.ref_status or {}).items():
-            if status is not None:
+            if error is not None:
                 errstream.write(
-                    b"Push of ref %s failed: %s" %
+                    b"Push of ref %s failed: %s\n" %
                     (ref, error.encode(err_encoding)))
             else:
-                errstream.write(b'Ref %s updated' % ref)
+                errstream.write(b'Ref %s updated\n' % ref)
 
         if remote_name is not None:
             _import_remote_refs(r.refs, remote_name, remote_changed_refs)
@@ -1035,7 +1077,7 @@ def pull(repo, remote_location=None, refspecs=None,
         for (lh, rh, force_ref) in selected_refs:
             try:
                 check_diverged(
-                    r.object_store, r.refs[rh], fetch_result.refs[lh])
+                    r, r.refs[rh], fetch_result.refs[lh])
             except DivergedBranches:
                 if fast_forward:
                     raise
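
The Python 3.5 branch added to `path_to_tree_path` avoids `pathlib.Path.resolve()` entirely and works on bytes with `os.path.relpath`. Its core logic can be exercised on its own; this is a trimmed sketch of that fallback (bytes-only, no encoding handling, POSIX paths assumed; `tree_path_fallback` is an illustrative name):

```python
import os


def tree_path_fallback(repopath: bytes, path: bytes) -> bytes:
    # Trimmed sketch of the pre-3.6 branch in porcelain.path_to_tree_path:
    # os.path.relpath instead of Path.resolve(), which before Python 3.6
    # could raise FileNotFoundError for paths that do not exist yet.
    treepath = os.path.relpath(path, repopath)
    if treepath.startswith(b'..'):
        raise ValueError(
            'Path %r not in repo path (%r)' % (path, repopath))
    if os.path.sep != '/':
        # Normalize Windows separators to the '/' git trees use.
        treepath = treepath.replace(os.path.sep.encode('ascii'), b'/')
    return treepath
```

The `..` check is what turns "path outside the repository" into a `ValueError` instead of a nonsensical tree path.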

+ 91 - 3
dulwich/refs.py

@@ -390,6 +390,35 @@ class RefsContainer(object):
                 ret[src] = dst
         return ret
 
+    def watch(self):
+        """Watch for changes to the refs in this container.
+
+        Returns a context manager that yields tuples with (refname, new_sha)
+        """
+        raise NotImplementedError(self.watch)
+
+
+class _DictRefsWatcher(object):
+
+    def __init__(self, refs):
+        self._refs = refs
+
+    def __enter__(self):
+        from queue import Queue
+        self.queue = Queue()
+        self._refs._watchers.add(self)
+        return self
+
+    def __next__(self):
+        return self.queue.get()
+
+    def _notify(self, entry):
+        self.queue.put_nowait(entry)
+
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        self._refs._watchers.remove(self)
+        return False
+
 
 class DictRefsContainer(RefsContainer):
     """RefsContainer backed by a simple dict.
@@ -402,6 +431,7 @@ class DictRefsContainer(RefsContainer):
         super(DictRefsContainer, self).__init__(logger=logger)
         self._refs = refs
         self._peeled = {}
+        self._watchers = set()
 
     def allkeys(self):
         return self._refs.keys()
@@ -412,11 +442,20 @@ class DictRefsContainer(RefsContainer):
     def get_packed_refs(self):
         return {}
 
+    def _notify(self, ref, newsha):
+        for watcher in self._watchers:
+            watcher._notify((ref, newsha))
+
+    def watch(self):
+        return _DictRefsWatcher(self)
+
     def set_symbolic_ref(self, name, other, committer=None,
                          timestamp=None, timezone=None, message=None):
         old = self.follow(name)[-1]
-        self._refs[name] = SYMREF + other
-        self._log(name, old, old, committer=committer, timestamp=timestamp,
+        new = SYMREF + other
+        self._refs[name] = new
+        self._notify(name, new)
+        self._log(name, old, new, committer=committer, timestamp=timestamp,
                   timezone=timezone, message=message)
 
     def set_if_equals(self, name, old_ref, new_ref, committer=None,
@@ -428,6 +467,7 @@ class DictRefsContainer(RefsContainer):
             self._check_refname(realname)
             old = self._refs.get(realname)
             self._refs[realname] = new_ref
+            self._notify(realname, new_ref)
             self._log(realname, old, new_ref, committer=committer,
                       timestamp=timestamp, timezone=timezone, message=message)
         return True
@@ -437,6 +477,7 @@ class DictRefsContainer(RefsContainer):
         if name in self._refs:
             return False
         self._refs[name] = ref
+        self._notify(name, ref)
         self._log(name, None, ref, committer=committer, timestamp=timestamp,
                   timezone=timezone, message=message)
         return True
@@ -450,6 +491,7 @@ class DictRefsContainer(RefsContainer):
         except KeyError:
             pass
         else:
+            self._notify(name, None)
            self._log(name, old, None, committer=committer,
                      timestamp=timestamp, timezone=timezone, message=message)
         return True
@@ -461,7 +503,8 @@ class DictRefsContainer(RefsContainer):
         """Update multiple refs; intended only for testing."""
         # TODO(dborowitz): replace this with a public function that uses
         # set_if_equal.
-        self._refs.update(refs)
+        for ref, sha in refs.items():
+            self.set_if_equals(ref, None, sha)
 
     def _update_peeled(self, peeled):
         """Update cached peeled refs; intended only for testing."""
@@ -502,6 +545,47 @@ class InfoRefsContainer(RefsContainer):
             return self._refs[name]
 
 
+class _InotifyRefsWatcher(object):
+
+    def __init__(self, path):
+        import pyinotify
+        from queue import Queue
+        self.path = os.fsdecode(path)
+        self.manager = pyinotify.WatchManager()
+        self.manager.add_watch(
+            self.path, pyinotify.IN_DELETE |
+            pyinotify.IN_CLOSE_WRITE | pyinotify.IN_MOVED_TO, rec=True,
+            auto_add=True)
+
+        self.notifier = pyinotify.ThreadedNotifier(
+            self.manager, default_proc_fun=self._notify)
+        self.queue = Queue()
+
+    def _notify(self, event):
+        if event.dir:
+            return
+        if event.pathname.endswith('.lock'):
+            return
+        ref = os.fsencode(os.path.relpath(event.pathname, self.path))
+        if event.maskname == 'IN_DELETE':
+            self.queue.put_nowait((ref, None))
+        elif event.maskname in ('IN_CLOSE_WRITE', 'IN_MOVED_TO'):
+            with open(event.pathname, 'rb') as f:
+                sha = f.readline().rstrip(b'\n\r')
+                self.queue.put_nowait((ref, sha))
+
+    def __next__(self):
+        return self.queue.get()
+
+    def __enter__(self):
+        self.notifier.start()
+        return self
+
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        self.notifier.stop()
+        return False
+
+
 class DiskRefsContainer(RefsContainer):
     """Refs container that reads refs from disk."""
 
@@ -848,6 +932,10 @@ class DiskRefsContainer(RefsContainer):
 
         return True
 
+    def watch(self):
+        import pyinotify  # noqa: F401
+        return _InotifyRefsWatcher(self.path)
+
 
 def _split_ref_line(line):
     """Split a single ref line into a tuple of SHA1 and name."""
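
Both new watcher classes share one shape: a context manager that is also an iterator, feeding `(refname, new_sha)` tuples through a `queue.Queue` (with `None` signalling deletion). A self-contained sketch of that shape — the `Refs`/`RefsWatcher` names here are illustrative stand-ins for `DictRefsContainer` and `_DictRefsWatcher`:

```python
from queue import Queue


class RefsWatcher:
    # Context manager + iterator, like the diff's _DictRefsWatcher:
    # registering on enter, deregistering on exit, and draining a
    # queue of (refname, new_sha) tuples in between.
    def __init__(self, refs):
        self._refs = refs

    def __enter__(self):
        self.queue = Queue()
        self._refs._watchers.add(self)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._refs._watchers.remove(self)
        return False

    def __next__(self):
        return self.queue.get()

    def _notify(self, entry):
        self.queue.put_nowait(entry)


class Refs:
    # Minimal container: every write fans out to active watchers.
    def __init__(self):
        self._refs = {}
        self._watchers = set()

    def set_ref(self, name, sha):
        self._refs[name] = sha
        for watcher in self._watchers:
            watcher._notify((name, sha))

    def watch(self):
        return RefsWatcher(self)


refs = Refs()
with refs.watch() as watcher:
    refs.set_ref(b'refs/heads/master', b'abc123')
    assert next(watcher) == (b'refs/heads/master', b'abc123')
```

The same contract lets callers iterate `DiskRefsContainer.watch()` and `DictRefsContainer.watch()` interchangeably, even though one is driven by pyinotify events and the other by in-process `_notify` calls.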

+ 106 - 52
dulwich/repo.py

@@ -33,6 +33,14 @@ import os
 import sys
 import sys
 import stat
 import stat
 import time
 import time
+from typing import Optional, Tuple, TYPE_CHECKING, List, Dict, Union, Iterable
+
+if TYPE_CHECKING:
+    # There are no circular imports here, but we try to defer imports as long
+    # as possible to reduce start-up time for anything that doesn't need
+    # these imports.
+    from dulwich.config import StackedConfig, ConfigFile
+    from dulwich.index import Index
 
 
 from dulwich.errors import (
 from dulwich.errors import (
     NoIndexPresent,
     NoIndexPresent,
@@ -51,10 +59,12 @@ from dulwich.file import (
 from dulwich.object_store import (
 from dulwich.object_store import (
     DiskObjectStore,
     DiskObjectStore,
     MemoryObjectStore,
     MemoryObjectStore,
+    BaseObjectStore,
     ObjectStoreGraphWalker,
     ObjectStoreGraphWalker,
     )
     )
 from dulwich.objects import (
 from dulwich.objects import (
     check_hexsha,
     check_hexsha,
+    valid_hexsha,
     Blob,
     Blob,
     Commit,
     Commit,
     ShaFile,
     ShaFile,
@@ -66,6 +76,7 @@ from dulwich.pack import (
     )
     )
 
 
 from dulwich.hooks import (
 from dulwich.hooks import (
+    Hook,
     PreCommitShellHook,
     PreCommitShellHook,
     PostCommitShellHook,
     PostCommitShellHook,
     CommitMsgShellHook,
     CommitMsgShellHook,
@@ -120,7 +131,7 @@ class InvalidUserIdentity(Exception):
         self.identity = identity
         self.identity = identity
 
 
 
 
-def _get_default_identity():
+def _get_default_identity() -> Tuple[str, str]:
     import getpass
     import getpass
     import socket
     import socket
     username = getpass.getuser()
     username = getpass.getuser()
@@ -143,19 +154,38 @@ def _get_default_identity():
     return (fullname, email)
     return (fullname, email)
 
 
 
 
-def get_user_identity(config, kind=None):
+def get_user_identity(
+        config: 'StackedConfig',
+        kind: Optional[str] = None) -> bytes:
     """Determine the identity to use for new commits.
     """Determine the identity to use for new commits.
+
+    If kind is set, this first checks
+    GIT_${KIND}_NAME and GIT_${KIND}_EMAIL.
+
+    If those variables are not set, then it will fall back
+    to reading the user.name and user.email settings from
+    the specified configuration.
+
+    If that also fails, then it will fall back to using
+    the current users' identity as obtained from the host
+    system (e.g. the gecos field, $EMAIL, $USER@$(hostname -f).
+
+    Args:
+      kind: Optional kind to return identity for,
+        usually either "AUTHOR" or "COMMITTER".
+
+    Returns:
+      A user identity
     """
     """
+    user = None  # type: Optional[bytes]
+    email = None  # type: Optional[bytes]
     if kind:
     if kind:
-        user = os.environ.get("GIT_" + kind + "_NAME")
-        if user is not None:
-            user = user.encode('utf-8')
-        email = os.environ.get("GIT_" + kind + "_EMAIL")
-        if email is not None:
-            email = email.encode('utf-8')
-    else:
-        user = None
-        email = None
+        user_uc = os.environ.get("GIT_" + kind + "_NAME")
+        if user_uc is not None:
+            user = user_uc.encode('utf-8')
+        email_uc = os.environ.get("GIT_" + kind + "_EMAIL")
+        if email_uc is not None:
+            email = email_uc.encode('utf-8')
     if user is None:
     if user is None:
         try:
         try:
             user = config.get(("user", ), "name")
             user = config.get(("user", ), "name")
@@ -168,16 +198,12 @@ def get_user_identity(config, kind=None):
             email = None
     default_user, default_email = _get_default_identity()
     if user is None:
-        user = default_user
-        if not isinstance(user, bytes):
-            user = user.encode('utf-8')
+        user = default_user.encode('utf-8')
     if email is None:
-        email = default_email
-        if not isinstance(email, bytes):
-            email = email.encode('utf-8')
+        email = default_email.encode('utf-8')
     if email.startswith(b'<') and email.endswith(b'>'):
         email = email[1:-1]
-    return (user + b" <" + email + b">")
+    return user + b" <" + email + b">"
 
 
 def check_user_identity(identity):
@@ -196,7 +222,8 @@ def check_user_identity(identity):
         raise InvalidUserIdentity(identity)
 
 
-def parse_graftpoints(graftpoints):
+def parse_graftpoints(
+        graftpoints: Iterable[bytes]) -> Dict[bytes, List[bytes]]:
     """Convert a list of graftpoints into a dict
 
     Args:
@@ -227,7 +254,7 @@ def parse_graftpoints(graftpoints):
     return grafts
 
 
-def serialize_graftpoints(graftpoints):
+def serialize_graftpoints(graftpoints: Dict[bytes, List[bytes]]) -> bytes:
     """Convert a dictionary of grafts into string
 
     The graft dictionary is:
@@ -270,6 +297,25 @@ def _set_filesystem_hidden(path):
     # Could implement other platform specific filesystem hiding here
 
 
+class ParentsProvider(object):
+
+    def __init__(self, store, grafts={}, shallows=[]):
+        self.store = store
+        self.grafts = grafts
+        self.shallows = set(shallows)
+
+    def get_parents(self, commit_id, commit=None):
+        try:
+            return self.grafts[commit_id]
+        except KeyError:
+            pass
+        if commit_id in self.shallows:
+            return []
+        if commit is None:
+            commit = self.store[commit_id]
+        return commit.parents
+
+
 class BaseRepo(object):
     """Base class for a git repository.
 
@@ -279,7 +325,7 @@ class BaseRepo(object):
         repository
     """
 
-    def __init__(self, object_store, refs):
+    def __init__(self, object_store: BaseObjectStore, refs: RefsContainer):
         """Open a repository.
 
         This shouldn't be called directly, but rather through one of the
@@ -292,17 +338,17 @@ class BaseRepo(object):
         self.object_store = object_store
         self.refs = refs
 
-        self._graftpoints = {}
-        self.hooks = {}
+        self._graftpoints = {}  # type: Dict[bytes, List[bytes]]
+        self.hooks = {}  # type: Dict[str, Hook]
 
-    def _determine_file_mode(self):
+    def _determine_file_mode(self) -> bool:
         """Probe the file-system to determine whether permissions can be trusted.
 
         Returns: True if permissions can be trusted, False otherwise.
         """
         raise NotImplementedError(self._determine_file_mode)
 
-    def _init_files(self, bare):
+    def _init_files(self, bare: bool) -> None:
         """Initialize a default set of named files."""
         from dulwich.config import ConfigFile
         self._put_named_file('description', b"Unnamed repository")
@@ -460,10 +506,11 @@ class BaseRepo(object):
             # commits aren't missing.
             haves = []
 
+        parents_provider = ParentsProvider(
+            self.object_store, shallows=shallows)
+
         def get_parents(commit):
-            if commit.id in shallows:
-                return []
-            return self.get_parents(commit.id, commit)
+            return parents_provider.get_parents(commit.id, commit)
 
         return self.object_store.iter_shas(
           self.object_store.find_missing_objects(
@@ -498,17 +545,18 @@ class BaseRepo(object):
             heads = [
                 sha for sha in self.refs.as_dict(b'refs/heads').values()
                 if sha in self.object_store]
+        parents_provider = ParentsProvider(self.object_store)
         return ObjectStoreGraphWalker(
-            heads, self.get_parents, shallow=self.get_shallow())
+            heads, parents_provider.get_parents, shallow=self.get_shallow())
 
-    def get_refs(self):
+    def get_refs(self) -> Dict[bytes, bytes]:
         """Get dictionary with all refs.
 
         Returns: A ``dict`` mapping ref names to SHA1s
         """
         return self.refs.as_dict()
 
-    def head(self):
+    def head(self) -> bytes:
         """Return the SHA1 pointed at by HEAD."""
         return self.refs[b'HEAD']
 
@@ -529,7 +577,7 @@ class BaseRepo(object):
                   ret.type_name, cls.type_name))
         return ret
 
-    def get_object(self, sha):
+    def get_object(self, sha: bytes) -> ShaFile:
         """Retrieve the object with the specified SHA.
 
         Args:
@@ -540,7 +588,12 @@ class BaseRepo(object):
         """
         """
         return self.object_store[sha]
         return self.object_store[sha]
 
 
-    def get_parents(self, sha, commit=None):
+    def parents_provider(self):
+        return ParentsProvider(
+            self.object_store, grafts=self._graftpoints,
+            shallows=self.get_shallow())
+
+    def get_parents(self, sha: bytes, commit: Commit = None) -> List[bytes]:
         """Retrieve the parents of a specific commit.
         """Retrieve the parents of a specific commit.
 
 
         If the specific commit is a graftpoint, the graft parents
         If the specific commit is a graftpoint, the graft parents
@@ -551,13 +604,7 @@ class BaseRepo(object):
           commit: Optional commit matching the sha
         Returns: List of parents
         """
-
-        try:
-            return self._graftpoints[sha]
-        except KeyError:
-            if commit is None:
-                commit = self[sha]
-            return commit.parents
+        return self.parents_provider().get_parents(sha, commit)
 
     def get_config(self):
         """Retrieve the config object.
@@ -582,7 +629,7 @@ class BaseRepo(object):
         """
         """
         raise NotImplementedError(self.set_description)
         raise NotImplementedError(self.set_description)
 
 
-    def get_config_stack(self):
+    def get_config_stack(self) -> 'StackedConfig':
         """Return a config stack for this repository.
         """Return a config stack for this repository.
 
 
         This stack accesses the configuration for both this repository
         This stack accesses the configuration for both this repository
@@ -695,18 +742,18 @@ class BaseRepo(object):
         except RefFormatError:
             raise KeyError(name)
 
-    def __contains__(self, name):
+    def __contains__(self, name: bytes) -> bool:
         """Check if a specific Git object or ref is present.
 
         Args:
           name: Git object SHA1 or ref name
         """
-        if len(name) in (20, 40):
+        if len(name) == 20 or (len(name) == 40 and valid_hexsha(name)):
             return name in self.object_store or name in self.refs
         else:
             return name in self.refs
 
-    def __setitem__(self, name, value):
+    def __setitem__(self, name: bytes, value: Union[ShaFile, bytes]):
         """Set a ref.
 
         Args:
@@ -723,7 +770,7 @@ class BaseRepo(object):
         else:
             raise ValueError(name)
 
-    def __delitem__(self, name):
+    def __delitem__(self, name: bytes):
         """Remove a ref.
 
         Args:
@@ -734,13 +781,14 @@ class BaseRepo(object):
         else:
             raise ValueError(name)
 
-    def _get_user_identity(self, config, kind=None):
+    def _get_user_identity(
+            self, config: 'StackedConfig', kind: str = None) -> bytes:
         """Determine the identity to use for new commits.
         """
         # TODO(jelmer): Deprecate this function in favor of get_user_identity
         return get_user_identity(config)
 
-    def _add_graftpoints(self, updated_graftpoints):
+    def _add_graftpoints(self, updated_graftpoints: Dict[bytes, List[bytes]]):
         """Add or modify graftpoints
 
         Args:
@@ -754,7 +802,7 @@ class BaseRepo(object):
 
         self._graftpoints.update(updated_graftpoints)
 
-    def _remove_graftpoints(self, to_remove=[]):
+    def _remove_graftpoints(self, to_remove: List[bytes] = []) -> None:
         """Remove graftpoints
 
         Args:
@@ -777,10 +825,14 @@ class BaseRepo(object):
                   ref=b'HEAD', merge_heads=None):
         """Create a new commit.
 
+        If not specified, `committer` and `author` default to
+        get_user_identity(..., 'COMMITTER')
+        and get_user_identity(..., 'AUTHOR') respectively.
+
         Args:
           message: Commit message
           committer: Committer fullname
-          author: Author fullname (defaults to committer)
+          author: Author fullname
           commit_timestamp: Commit timestamp (defaults to now)
           commit_timezone: Commit timestamp timezone (defaults to GMT)
           author_timestamp: Author timestamp (defaults to commit
@@ -792,7 +844,9 @@ class BaseRepo(object):
           encoding: Encoding
           ref: Optional ref to commit to (defaults to current branch)
           merge_heads: Merge heads (defaults to .git/MERGE_HEADS)
-        Returns: New commit SHA1
+
+        Returns:
+          New commit SHA1
         """
         import time
         c = Commit()
@@ -1093,7 +1147,7 @@ class Repo(BaseRepo):
         """Return path to the index file."""
         """Return path to the index file."""
         return os.path.join(self.controldir(), INDEX_FILENAME)
         return os.path.join(self.controldir(), INDEX_FILENAME)
 
 
-    def open_index(self):
+    def open_index(self) -> 'Index':
         """Open the index for this repository.
         """Open the index for this repository.
 
 
         Raises:
         Raises:
@@ -1241,7 +1295,7 @@ class Repo(BaseRepo):
             honor_filemode=honor_filemode,
             honor_filemode=honor_filemode,
             validate_path_element=validate_path_element)
             validate_path_element=validate_path_element)
 
 
-    def get_config(self):
+    def get_config(self) -> 'ConfigFile':
         """Retrieve the config object.
         """Retrieve the config object.
 
 
         Returns: `ConfigFile` object for the ``.git/config`` file.
         Returns: `ConfigFile` object for the ``.git/config`` file.
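The dulwich/repo.py hunks above introduce `ParentsProvider`, which funnels every parent lookup through one code path: grafts override the recorded parents, shallow boundaries truncate history, and only then is the commit's own parent list consulted. Below is a standalone rendering of that class, exercised against a plain dict standing in for the object store (the `FakeCommit` type and the example ids are illustrative only, not part of dulwich):

```python
# The ParentsProvider logic from the diff above, exercised in isolation.
# FakeCommit and the dict-based store are stand-ins for illustration;
# in dulwich the store is the repository's object store.
class FakeCommit:
    def __init__(self, parents):
        self.parents = parents


class ParentsProvider:
    def __init__(self, store, grafts={}, shallows=[]):
        self.store = store             # commit id -> commit object
        self.grafts = grafts           # commit id -> replacement parents
        self.shallows = set(shallows)  # ids where history is cut off

    def get_parents(self, commit_id, commit=None):
        # Grafts take precedence over the real parents.
        try:
            return self.grafts[commit_id]
        except KeyError:
            pass
        # A shallow boundary pretends the commit has no parents.
        if commit_id in self.shallows:
            return []
        if commit is None:
            commit = self.store[commit_id]
        return commit.parents


store = {b'c1': FakeCommit([]), b'c2': FakeCommit([b'c1'])}
plain = ParentsProvider(store)
grafted = ParentsProvider(store, grafts={b'c2': [b'c0']})
shallow = ParentsProvider(store, shallows=[b'c2'])
```

Both `BaseRepo.get_parents` and the graph walker in the diff delegate to this single code path instead of duplicating the graft/shallow checks.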

+ 31 - 0
dulwich/tests/test_client.py

@@ -48,12 +48,15 @@ from dulwich.client import (
     StrangeHostname,
     SubprocessSSHVendor,
     PLinkSSHVendor,
+    HangupException,
+    GitProtocolError,
     check_wants,
     default_urllib3_manager,
     get_credentials_from_store,
     get_transport_and_path,
     get_transport_and_path_from_url,
     parse_rsync_url,
+    _remote_error_from_stderr,
     )
 from dulwich.config import (
     ConfigDict,
@@ -1357,3 +1360,31 @@ class GitCredentialStoreTests(TestCase):
             get_credentials_from_store(
                 b'https', b'example.org', b'otheruser', fnames=[self.fname]),
             None)
+
+
+class RemoteErrorFromStderrTests(TestCase):
+
+    def test_nothing(self):
+        self.assertEqual(
+            _remote_error_from_stderr(None), HangupException())
+
+    def test_error_line(self):
+        b = BytesIO(b"""\
+This is some random output.
+ERROR: This is the actual error
+with a tail
+""")
+        self.assertEqual(
+            _remote_error_from_stderr(b),
+            GitProtocolError("This is the actual error"))
+
+    def test_no_error_line(self):
+        b = BytesIO(b"""\
+This is output without an error line.
+And this line is just random noise, too.
+""")
+        self.assertEqual(
+            _remote_error_from_stderr(b),
+            HangupException([
+                b"This is output without an error line.",
+                b"And this line is just random noise, too."]))

+ 14 - 13
dulwich/tests/test_graph.py

@@ -23,7 +23,7 @@
 
 from dulwich.tests import TestCase
 from dulwich.tests.utils import make_commit
-from dulwich.object_store import MemoryObjectStore
+from dulwich.repo import MemoryRepo
 
 from dulwich.graph import _find_lcas, can_fast_forward
 
@@ -161,24 +161,25 @@ class FindMergeBaseTests(TestCase):
 class CanFastForwardTests(TestCase):
 
     def test_ff(self):
-        store = MemoryObjectStore()
+        r = MemoryRepo()
         base = make_commit()
         c1 = make_commit(parents=[base.id])
         c2 = make_commit(parents=[c1.id])
-        store.add_objects([(base, None), (c1, None), (c2, None)])
-        self.assertTrue(can_fast_forward(store, c1.id, c1.id))
-        self.assertTrue(can_fast_forward(store, base.id, c1.id))
-        self.assertTrue(can_fast_forward(store, c1.id, c2.id))
-        self.assertFalse(can_fast_forward(store, c2.id, c1.id))
+        r.object_store.add_objects([(base, None), (c1, None), (c2, None)])
+        self.assertTrue(can_fast_forward(r, c1.id, c1.id))
+        self.assertTrue(can_fast_forward(r, base.id, c1.id))
+        self.assertTrue(can_fast_forward(r, c1.id, c2.id))
+        self.assertFalse(can_fast_forward(r, c2.id, c1.id))
 
     def test_diverged(self):
-        store = MemoryObjectStore()
+        r = MemoryRepo()
         base = make_commit()
         c1 = make_commit(parents=[base.id])
         c2a = make_commit(parents=[c1.id], message=b'2a')
         c2b = make_commit(parents=[c1.id], message=b'2b')
-        store.add_objects([(base, None), (c1, None), (c2a, None), (c2b, None)])
-        self.assertTrue(can_fast_forward(store, c1.id, c2a.id))
-        self.assertTrue(can_fast_forward(store, c1.id, c2b.id))
-        self.assertFalse(can_fast_forward(store, c2a.id, c2b.id))
-        self.assertFalse(can_fast_forward(store, c2b.id, c2a.id))
+        r.object_store.add_objects(
+            [(base, None), (c1, None), (c2a, None), (c2b, None)])
+        self.assertTrue(can_fast_forward(r, c1.id, c2a.id))
+        self.assertTrue(can_fast_forward(r, c1.id, c2b.id))
+        self.assertFalse(can_fast_forward(r, c2a.id, c2b.id))
+        self.assertFalse(can_fast_forward(r, c2b.id, c2a.id))

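The rewritten `CanFastForwardTests` above encode what a fast-forward means: `can_fast_forward(repo, c1, c2)` holds when `c1` equals `c2` or is an ancestor of `c2`, and fails for diverged siblings. A minimal ancestor-walk sketch of that predicate (dulwich's real implementation goes through `_find_lcas` and a repository; this stand-in walks a plain parent mapping instead):

```python
def can_fast_forward(parents, c1, c2):
    """Return True if c1 is c2 or an ancestor of c2.

    `parents` maps commit id -> list of parent ids; it is an
    illustrative stand-in for a repository's parent lookup.
    """
    todo = [c2]
    seen = set()
    # Walk backwards from c2; if we reach c1, c2 is a descendant.
    while todo:
        commit = todo.pop()
        if commit == c1:
            return True
        if commit in seen:
            continue
        seen.add(commit)
        todo.extend(parents.get(commit, []))
    return False


# Linear history base <- c1 <- c2, plus diverged siblings c2a/c2b of c1,
# mirroring the two test cases above.
parents = {
    b'base': [],
    b'c1': [b'base'],
    b'c2': [b'c1'],
    b'c2a': [b'c1'],
    b'c2b': [b'c1'],
}
```

The diverged case fails in both directions because neither sibling is reachable from the other, which is exactly why a real push in that situation needs `--force`.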
+ 75 - 0
dulwich/tests/test_porcelain.py

@@ -22,6 +22,7 @@
 
 from io import BytesIO, StringIO
 import os
+import re
 import shutil
 import tarfile
 import tempfile
@@ -512,6 +513,20 @@ class RemoveTests(PorcelainTestCase):
         finally:
             os.chdir(cwd)
 
+    def test_remove_file_removed_on_disk(self):
+        fullpath = os.path.join(self.repo.path, 'foo')
+        with open(fullpath, 'w') as f:
+            f.write("BAR")
+        porcelain.add(self.repo.path, paths=[fullpath])
+        cwd = os.getcwd()
+        try:
+            os.chdir(self.repo.path)
+            os.remove(fullpath)
+            porcelain.remove(self.repo.path, paths=["foo"])
+        finally:
+            os.chdir(cwd)
+        self.assertFalse(os.path.exists(os.path.join(self.repo.path, 'foo')))
+
 
 class LogTests(PorcelainTestCase):
 
@@ -924,6 +939,53 @@ class PushTests(PorcelainTestCase):
             self.assertEqual(os.path.basename(fullpath),
                              change.new.path.decode('ascii'))
 
+    def test_local_missing(self):
+        """Pushing a new branch."""
+        outstream = BytesIO()
+        errstream = BytesIO()
+
+        # Setup target repo cloned from temp test repo
+        clone_path = tempfile.mkdtemp()
+        self.addCleanup(shutil.rmtree, clone_path)
+        target_repo = porcelain.init(clone_path)
+        target_repo.close()
+
+        self.assertRaises(
+            porcelain.Error,
+            porcelain.push, self.repo, clone_path,
+            b"HEAD:refs/heads/master",
+            outstream=outstream, errstream=errstream)
+
+    def test_new(self):
+        """Pushing a new branch."""
+        outstream = BytesIO()
+        errstream = BytesIO()
+
+        # Setup target repo cloned from temp test repo
+        clone_path = tempfile.mkdtemp()
+        self.addCleanup(shutil.rmtree, clone_path)
+        target_repo = porcelain.init(clone_path)
+        target_repo.close()
+
+        # create a second file to be pushed back to origin
+        handle, fullpath = tempfile.mkstemp(dir=clone_path)
+        os.close(handle)
+        porcelain.add(repo=clone_path, paths=[fullpath])
+        new_id = porcelain.commit(
+            repo=self.repo, message=b'push',
+            author=b'author <email>',
+            committer=b'committer <email>')
+
+        # Push to the remote
+        porcelain.push(self.repo, clone_path, b"HEAD:refs/heads/master",
+                       outstream=outstream, errstream=errstream)
+
+        with Repo(clone_path) as r_clone:
+            self.assertEqual({
+                b'HEAD': new_id,
+                b'refs/heads/master': new_id,
+                }, r_clone.get_refs())
+
     def test_delete(self):
         """Basic test of porcelain push, removing a branch.
         """
@@ -981,6 +1043,9 @@ class PushTests(PorcelainTestCase):
             author=b'author <email>',
             committer=b'committer <email>')
 
+        outstream = BytesIO()
+        errstream = BytesIO()
+
         # Push to the remote
         self.assertRaises(
             porcelain.DivergedBranches,
@@ -992,6 +1057,12 @@ class PushTests(PorcelainTestCase):
             b'refs/heads/master': remote_id,
             }, self.repo.get_refs())
 
+        self.assertEqual(b'', outstream.getvalue())
+        self.assertEqual(b'', errstream.getvalue())
+
+        outstream = BytesIO()
+        errstream = BytesIO()
+
         # Push to the remote with --force
         porcelain.push(
             clone_path, self.repo.path, b'refs/heads/master',
@@ -1002,6 +1073,10 @@ class PushTests(PorcelainTestCase):
             b'refs/heads/master': local_id,
             }, self.repo.get_refs())
 
+        self.assertEqual(b'', outstream.getvalue())
+        self.assertTrue(
+            re.match(b'Push to .* successful.\n', errstream.getvalue()))
+
 
 class PullTests(PorcelainTestCase):
 

+ 24 - 0
dulwich/tests/test_refs.py

@@ -323,6 +323,30 @@ class RefsContainerTests(object):
         self.assertNotIn(
             b'refs/remotes/origin/other', self._refs)
 
+    def test_watch(self):
+        try:
+            watcher = self._refs.watch()
+        except (NotImplementedError, ImportError):
+            self.skipTest('watching not supported')
+        with watcher:
+            self._refs[b'refs/remotes/origin/other'] = (
+                b'48d01bd4b77fed026b154d16493e5deab78f02ec')
+            change = next(watcher)
+            self.assertEqual(
+                (b'refs/remotes/origin/other',
+                 b'48d01bd4b77fed026b154d16493e5deab78f02ec'), change)
+            self._refs[b'refs/remotes/origin/other'] = (
+                b'48d01bd4b77fed026b154d16493e5deab78f02ed')
+            change = next(watcher)
+            self.assertEqual(
+                (b'refs/remotes/origin/other',
+                 b'48d01bd4b77fed026b154d16493e5deab78f02ed'), change)
+            del self._refs[b'refs/remotes/origin/other']
+            change = next(watcher)
+            self.assertEqual(
+                (b'refs/remotes/origin/other',
+                 None), change)
+
 
 class DictRefsContainerTests(RefsContainerTests, TestCase):
 

+ 12 - 0
dulwich/tests/test_repository.py

@@ -207,6 +207,7 @@ class RepositoryRootTests(TestCase):
     def test_contains_object(self):
         r = self.open_repo('a.git')
         self.assertTrue(r.head() in r)
+        self.assertFalse(b"z" * 40 in r)
 
     def test_contains_ref(self):
         r = self.open_repo('a.git')
@@ -251,6 +252,17 @@ class RepositoryRootTests(TestCase):
         r = self.open_repo('a.git')
         self.assertEqual(r.get_peeled(b'HEAD'), r.head())
 
+    def test_get_parents(self):
+        r = self.open_repo('a.git')
+        self.assertEqual(
+            [b'2a72d929692c41d8554c07f6301757ba18a65d91'],
+            r.get_parents(b'a90fa2d900a17e99b433217e988c4eb4a2e9a097'))
+        r.update_shallow(
+                [b'a90fa2d900a17e99b433217e988c4eb4a2e9a097'],
+                None)
+        self.assertEqual(
+            [], r.get_parents(b'a90fa2d900a17e99b433217e988c4eb4a2e9a097'))
+
     def test_get_walker(self):
         r = self.open_repo('a.git')
         # include defaults to [r.head()]

+ 4 - 2
setup.py

@@ -14,6 +14,7 @@ from distutils.core import Distribution
 import io
 import os
 import sys
+from typing import Dict, Any
 
 
 if sys.version_info < (3, 5):
@@ -22,7 +23,7 @@ if sys.version_info < (3, 5):
         'For 2.7 support, please install a version prior to 0.20')
 
 
-dulwich_version_string = '0.20.5'
+dulwich_version_string = '0.20.6'
 
 
 class DulwichDistribution(Distribution):
@@ -70,13 +71,14 @@ ext_modules = [
     Extension('dulwich._diff_tree', ['dulwich/_diff_tree.c']),
 ]
 
-setup_kwargs = {}
+setup_kwargs = {}  # type: Dict[str, Any]
 scripts = ['bin/dul-receive-pack', 'bin/dul-upload-pack']
 if has_setuptools:
     setup_kwargs['extras_require'] = {
         'fastimport': ['fastimport'],
         'https': ['urllib3[secure]>=1.24.1'],
         'pgp': ['gpg'],
+        'watch': ['pyinotify'],
         }
     setup_kwargs['install_requires'] = ['urllib3>=1.24.1', 'certifi']
     setup_kwargs['include_package_data'] = True

+ 30 - 0
status.yaml

@@ -0,0 +1,30 @@
+---
+configuration:
+ - key: core.compression
+   status: supported
+ - key: core.looseCompression
+   status: supported
+ - key: core.packCompression
+   status: supported
+ - key: core.filemode
+   status: supported
+ - key: http.proxy
+   status: supported
+ - key: http.useragent
+   status: supported
+ - key: http.sslVerify
+   status: supported
+ - key: http.sslCAInfo
+   status: supported
+ - key: i18n.commitEncoding
+   status: supported
+ - key: core.excludesFile
+   status: supported
+ - key: user.name
+   status: supported
+ - key: user.email
+   status: supported
+ - key: core.protectNTFS
+   status: supported
+ - key: core.ignorecase
+   status: supported